Podcasts about new kings

  • 171 PODCASTS
  • 199 EPISODES
  • 49m AVG DURATION
  • 1 EPISODE EVERY OTHER WEEK
  • Feb 11, 2025 LATEST

POPULARITY

[Popularity chart, 2017–2024]


Best podcasts about new kings

Latest podcast episodes about new kings

MRKT Matrix
S&P 500 Flat As Powell Urges Rate Caution

MRKT Matrix

Feb 11, 2025 · 8:26


MRKT Matrix - Tuesday, February 11th

  • S&P 500 is little changed as Powell urges caution on rate cuts, trade tensions escalate (CNBC)
  • Fed Chair Powell says central bank doesn't ‘need to be in a hurry' to lower interest rates further (CNBC)
  • The Magnificent 7 Are So Last Year. Cash Cows Are the New Kings. (WSJ)
  • Vance Warns U.S. Allies to Keep AI Regulation Light (WSJ)
  • OpenAI's Altman Says Musk ‘Probably Just Trying to Slow Us Down' (Bloomberg)
  • Why This Week's Inflation Report Is Especially Important (WSJ)
  • CPI for January 2025 is Projected to Rise 2.9% Year-Over-Year (FactSet)
  • GM expects to mitigate up to 50% of potential North American tariffs, which Ford describes as ‘chaos' (CNBC)
  • EU Pledges Countermeasure Against Trump's Tariffs on Metals (Bloomberg)

Subscribe to our newsletter: https://riskreversalmedia.beehiiv.com/subscribe

MRKT Matrix by RiskReversal Media is a daily AI-powered podcast bringing you the top stories moving financial markets. Story curation by RiskReversal, scripts by Perplexity Pro, voice by ElevenLabs.

The FreeMind Podcast
Rob Perna Jr on Music, Visual Arts, and Overcoming Challenges | FreeMind Local: West Chester, PA

The FreeMind Podcast

Feb 5, 2025 · 47:25


In this episode of FreeMind Local, we sit down with Rob Perna Jr, also known as Funky Starchild, to explore his journey in music, visual arts, and community impact. As the lead guitarist and vocalist of The New Kings of Rhythm and Onyx and Honey, Rob shares his experiences in the West Chester music scene, how he developed his visual arts skills, and his involvement in the Lookaround Music and Arts Festival. Rob also opens up about his legal struggles, incarceration, and how he turned that experience into personal growth by writing over 400 songs and teaching 50 inmates how to play guitar. From balancing music, art, and teaching to navigating life after prison, this conversation is a deep dive into the power of resilience, creativity, and community. Subscribe for more inspiring interviews with musicians, artists, and entrepreneurs.

Timestamps:
00:00 – Introduction to Rob Perna Jr
05:30 – The origins of Onyx and Honey
12:45 – Exploring visual arts during quarantine
18:30 – The West Chester arts scene and community spaces
25:10 – Musical influences and band evolution
33:00 – Legal struggles and time in prison
41:50 – Returning to music and rebuilding a career

Like, comment, and share this episode to support independent artists.

#FreeMindLocal #RobPernaJr #MusicJourney #VisualArts #WestChesterPA #BluesMusic #FunkMusic #CommunityImpact #LocalArtists #MusicPodcast #IndependentMusic

Squared Circle Podcast
New Kings and Queens at the Colosseum! MLW Review with Dos Evil and Marie Shadows

Squared Circle Podcast

Jan 26, 2025 · 63:50


Welcome to the Squared Circle Podcast with your host Marie Shadows! In this exciting episode of the Squared Circle Podcast, we dive into MLW's Kings of Colosseum PPV with special guest Dos Evil! We break down the entire card, starting with the jaw-dropping ladder match featuring five briefcases and a coveted MLW championship contract. From the chaotic action to the thrilling crowning of a new champion in the main event, we cover it all. Dos Evil kicks off the show by sharing his wrestling journey, his dream roster for a fantasy promotion, and his thoughts on NJPW. Join us for an engaging discussion full of wrestling analysis, fun banter, and plenty of passion for Major League Wrestling.

My wrestling planner is out: https://www.barnesandnoble.com/w/one-more-match-wrestling-planner-2025-by-marie-shadows-marie-shadows/1146859694?ean=9798341874039

Query & Schultz Podcast
Episode 220 - 1.21.25: Ohio State new kings, Colts nab a DC, surging Purdue/surviving IU!

Query & Schultz Podcast

Jan 22, 2025 · 70:01


We recap last night's College Football Playoff title game and what the result means for Ohio State, Notre Dame, and the Big Ten. Plus, what to expect from Lou Anarumo's defense in Indy, Purdue hitting its stride and IU staying alive, and Schultz's weird audiobook preferences.

GAA on Off The Ball
THE CLUB CHAMPIONSHIP SHOW: New kings to be crowned in All-Ireland finals at Croke Park | Errigal Ciaran and Dr Crokes serve up a clash for the ages

GAA on Off The Ball

Jan 14, 2025 · 61:32


Tommy Rooney joined Will O'Callaghan on this week's edition of The Club Championship Show, where the lads look back on last weekend's All-Ireland Senior Football semi-finals, including an epic between Errigal Ciaran and Dr Crokes. You'll hear from Errigal Ciaran boss Enda McGinley and star attacker Darragh Canavan, while Cuala boss Austin O'Malley chats about their victory over Coolera Strandhill in the other semi-final. With new names set to be etched on the Andy Merrigan and Tommy Moore Cups, Tommy breaks down some key battles ahead of Sunday's deciders.

The Club Championship Show on Off The Ball with AIB, proud sponsors of the All-Ireland Club Championships in hurling, football, camogie and ladies football. Check out the hashtag #TheToughest throughout the Championship.

Bob Ryan & Jeff Goodman NBA Podcast
Are the Thunder and Cavs the new kings of the NBA?

Bob Ryan & Jeff Goodman NBA Podcast

Jan 6, 2025 · 54:42


On this episode of the Bob Ryan & Jeff Goodman NBA Podcast, Bob, Jeff, and Gary Tanguay discuss all the biggest stories around the NBA. After the Celtics choke in OKC, are the Thunder the new favorites to win it all? What about Cleveland? Plus, a look at what could have been in the career of Derrick Rose, and the guys give their thoughts on the Patriots firing head coach Jerod Mayo. All that, and much more! The Bob Ryan & Jeff Goodman Podcast is presented by: Prize Picks! Get in on the excitement with PrizePicks, America's No. 1 Fantasy Sports App, where you can turn your hoops knowledge into serious cash. Download the app today and use code CLNS to get $50 when you play $5! PrizePicks, run your game! Go to https://PrizePicks.com/CLNS Gametime! Take the guesswork out of buying NBA tickets with Gametime. Download the Gametime app, create an account, and use code CLNS for $20 off your first purchase. Download Gametime today. What time is it? Gametime! Terms apply. Go to https://gametime.co ! Learn more about your ad choices. Visit megaphone.fm/adchoices

D-Lo & KC
1/2 Hour 1 - New Kings HC Doug Christie

D-Lo & KC

Jan 3, 2025 · 52:18


The guys talk about Doug Christie taking over as the interim Head Coach and more on the Kings.

Surviving Paradise
2024 November Governing Body Update: Jehovah's Witnesses Get Two New Kings & more!

Surviving Paradise

Nov 18, 2024 · 84:49


Jehovah's Witnesses have a new update, and it features insight into how many of them are rotting in prison, natural disaster highlights, and... most importantly... an interview with the two new members of the Governing Body!

Twitter: @exjwpodcast | Instagram: survivingparadisepodcast

The Long War - Warhammer 40k Podcast
Ep. 451 - 10th Edition Meta Shift | Genestealer Cults Are the New Kings of the Tabletop

The Long War - Warhammer 40k Podcast

Nov 14, 2024 · 72:57


Genestealer Cults and Imperial Guard steal the 40k show leading up to the WCW and new GW preview next weekend.

Get Your Precision Dice 5% Off at Baron of Dice: Use Code Longwar5
Buy Your 2025 LVO Long War 40k Doubles Tickets
Best & Worst 40k Armies in the Warhammer Meta
J15 Games Has Your Game Aides, Tokens & Templates! Get them here: https://bit.ly/J15GamesTLW
Become a veteran of the Long War & get a free t-shirt!
Join our discord
Sign Up For The Next Long War Doubles Event
Long War Gear Shop | Heretic Swag | Essential Hobby Products & Tools List
Learn 3D Printing While Working From Home
Buy Wyatt's Miniature

Table of Contents
00:00 Opening
06:50 Would You Rather
16:50 News & Discussion
54:45 Cali Cup GSC & IG

Welcome to The Long War, a new place for bringing the hobby back to wargaming! A podcast hosted by Rob Baer, Kenny Boucher & Wyatt Turk. Become a Veteran of the Long War! http://thelongwar.net/

Slander U Podcast
Texas: the New Kings of the SEC

Slander U Podcast

Oct 18, 2024 · 44:25


We talk about TikTok… then we talk football: Texas vs. OU, Oregon vs. Ohio State, and other games. Texas vs. UGA!

Surviving Paradise
Highlights From the 2024 Annual Meeting of Jehovah's Witnesses: New Kings!

Surviving Paradise

Oct 14, 2024 · 75:58


With Armageddon imminent, Jehovah's Witnesses are adding to Jesus' royal court on earth with two new king/priests; the Governing Body is lowering age restrictions for their free labor program; and a reminder that they have their own museum in Patterson, New York!

Twitter: @exjwpodcast | Instagram: survivingparadisepodcast

Arise and Abide
New Kings for Everyone

Arise and Abide

Oct 2, 2024 · 14:51


In this episode of Arise + Abide, Sally and Curtis explore the narrative in 2 Kings 8:7-29. The episode delves into the rise of new kings in both Israel and Judah, the tragic downfall of King Ben-Hadad, and the ominous future foretold by Elisha concerning Hazael's reign. Through deep reflection on the passage, Curtis and Sally discuss God's sovereignty over the nations, His faithfulness to His promises, and the personal cost of discipleship. They highlight the influence of human choices and external voices in leadership, using Jehoram's downfall as an example. Ultimately, they emphasize God's unwavering promise to David and how it is fulfilled in Jesus, the eternal light that leads to life. Tune in for a thoughtful reflection on obedience, leadership, and the eternal faithfulness of God.

KMJ's Afternoon Drive
QUICK HIT: Construction Left Dozens Of New Kings Co. Homes Without Power

KMJ's Afternoon Drive

Aug 29, 2024 · 3:59


A case of homebuilders' construction getting ahead of Pacific Gas and Electric led to sparks flying during a Kings County Board of Supervisors meeting on Tuesday as residents await electrification to move into their homes.

Please Like, Comment and Follow 'The Afternoon Drive with Philip Teresi & E. Curtis Johnson' on all platforms. The Afternoon Drive is available on the KMJNOW app, Apple Podcasts, Spotify, YouTube or wherever else you listen to podcasts. Weekdays 2-6 PM Pacific on News/Talk 580 AM & 105.9 FM KMJ. See omnystudio.com/listener for privacy information.

Philip Teresi Podcasts
QUICK HIT: Construction Left Dozens Of New Kings Co. Homes Without Power

Philip Teresi Podcasts

Aug 29, 2024 · 3:59


A case of homebuilders' construction getting ahead of Pacific Gas and Electric led to sparks flying during a Kings County Board of Supervisors meeting on Tuesday as residents await electrification to move into their homes.

Please Like, Comment and Follow 'The Afternoon Drive with Philip Teresi & E. Curtis Johnson' on all platforms. The Afternoon Drive is available on the KMJNOW app, Apple Podcasts, Spotify, YouTube or wherever else you listen to podcasts. Weekdays 2-6 PM Pacific on News/Talk 580 AM & 105.9 FM KMJ. See omnystudio.com/listener for privacy information.

"Talking At The Diner" Podcast Ep. 37 ft. onyx&honey.

"Talking At The Diner" Podcast

Aug 11, 2024 · 71:13


As the 2-headed driving force behind onyx&honey., Rob Perna and Nikki DiGiorgio are a musical love story that has become the heart and soul of the music scene in West Chester, PA. West Chester is a bit of an island unto itself: it's not too far from Philadelphia, but it is not what you would call a Philly satellite either. It's a unique scene that has had its fair share of cyclical change and Grateful Dead cover band over-saturation.

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Because of the nature of SAM, this is more video heavy than usual. See our YouTube!

Because vision is first among equals in multimodality, and yet SOTA vision language models are closed, we've always had an interest in learning what's next in vision. Our first viral episode was Segment Anything 1, and we have since covered LLaVA, IDEFICS, Adept, and Reka. But just like with Llama 3, FAIR holds a special place in our hearts as the New Kings of Open Source AI.

The list of sequels better than the originals is usually very short, but SAM 2 delighted us by not only being a better image segmentation model than SAM 1: it also conclusively and inexpensively solved video segmentation in just as elegant a way as SAM 1 did for images, and released everything to the community as Apache 2.0 / CC BY 4.0.

“In video segmentation, we observe better accuracy, using 3x fewer interactions than prior approaches. In image segmentation, our model is more accurate and 6x faster than the Segment Anything Model (SAM).”

Surprisingly Efficient

The paper reports that SAM 2 was trained on 256 A100 GPUs for 108 hours (59% more than SAM 1). Taking the upper-end $2 A100 cost off gpulist.ai means SAM 2 cost ~$50k to train if it had an external market-rate cost: surprisingly cheap for adding video understanding!

The newly released SA-V dataset is also the largest video segmentation dataset to date, with careful attention given to scene/object/geographical diversity, including that of annotators. In some ways, we are surprised that SOTA video segmentation can be done on only ~50,000 videos (and 640k masklet annotations).

Model-in-the-loop Data Engine for Annotations and Demo-first Development

Similar to SAM 1, a 3 Phase Data Engine helped greatly in bootstrapping this dataset.
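As a sanity check on the back-of-the-envelope training-cost figure quoted in the writeup (256 A100s for 108 hours at roughly $2/GPU-hour off gpulist.ai; the hourly rate is the writeup's assumption, not an official number), the arithmetic works out as follows:

```python
# Rough SAM 2 training-cost estimate from the figures quoted above.
# The ~$2/A100-hour market rate is an assumption taken from the writeup.
gpus = 256
hours = 108
usd_per_gpu_hour = 2.0

gpu_hours = gpus * hours                 # 27,648 GPU-hours
cost = gpu_hours * usd_per_gpu_hour      # $55,296, i.e. the ~$50k ballpark

print(f"{gpu_hours:,} GPU-hours -> ${cost:,.0f}")
```

Since the estimate scales linearly with the assumed hourly rate, even a rate several times higher would keep the run well under typical frontier-model training budgets, which is the writeup's point.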
As Nikhila says in the episode, the demo you see wasn't just for show; they actually used this same tool to do annotations for the model that is now demoed in the tool:

“With the original SAM, we put a lot of effort in building a high-quality demo. And the other piece here is that the demo is actually the annotation tool. So we actually use the demo as a way to improve our annotation tool. And so then it becomes very natural to invest in building a good demo, because it speeds up your annotation and improves the data quality, and that will improve the model quality. With this approach, we found it to be really successful.”

An incredible 90% speedup in annotation happened due to this virtuous cycle, which helped SA-V reach this incredible scale.

Building the demo also helped the team live the context that their own downstream users, like Roboflow, would experience, and forced them to make choices accordingly. As Nikhila says:

“It's a really encouraging trend for not thinking about only the new model capability, but what sort of applications folks want to build with models as a result of that downstream. I think it also really forces you to think about many things that you might postpone. For example, efficiency: for a good demo experience, making it real time is super important. No one wants to wait. And so it really forces you to think about these things much sooner, and actually makes us think about what kind of image encoder we want to use, or other hardware efficiency improvements. So those kind of things, I think, become a first-class citizen when you put the demo first.”

Indeed, the team swapped out standard ViT-H Vision Transformers for Hiera (Hierarchical) Vision Transformers as a result of efficiency considerations.

Memory Attention

Speaking of architecture, the model design is probably the sleeper hit of a project filled with hits. The team adapted SAM 1 to video by adding streaming memory for real-time video processing: specifically memory attention, a memory encoder, and a memory bank, which surprisingly ablated better than more intuitive but complex architectures like Gated Recurrent Units.

One has to wonder if streaming memory can be added to pure language models with a similar approach… (pls comment if there's an obvious one we haven't come across yet!)

Video Podcast

Tune in to Latent Space TV for the video demos mentioned in this video podcast!

Timestamps

* [00:00:00] The Rise of SAM by Udio (David Ding Edit)
* [00:03:07] Introducing Nikhila
* [00:06:38] The Impact of SAM 1 in 2023
* [00:12:15] Do People Finetune SAM?
* [00:16:05] Video Demo of SAM
* [00:20:01] Why the Demo is so Important
* [00:23:23] SAM 1 vs SAM 2 Architecture
* [00:26:46] Video Demo of SAM on Roboflow
* [00:32:44] Extending SAM 2 with other models
* [00:35:00] Limitations of SAM: Screenshots
* [00:38:56] SAM 2 Paper
* [00:39:15] SA-V Dataset and SAM Data Engine
* [00:43:15] Memory Attention to solve Video
* [00:47:24] "Context Length" in Memory Attention
* [00:48:17] Object Tracking
* [00:50:52] The Future of FAIR
* [00:52:23] CVPR, Trends in Vision
* [01:02:04] Calls to Action

Transcript

[00:00:00] [music intro]

[00:02:11] AI Charlie: Happy Yoga! This is your AI co-host Charlie. Thank you for all the love for our special 1 million downloads Wins of AI Winter episode last week, especially Sam, Archie, Trellis, Morgan, Shrey, Han, and more. For this episode, we have to go all the way back to the first viral episode of the podcast, Segment Anything Model and the Hard Problems of Computer Vision, which we discussed with Joseph Nelson of Roboflow.

[00:02:39] AI Charlie: Since Meta released SAM 2 last week, we are delighted to welcome Joseph back as our fourth guest co-host to chat with Nikhila Ravi, Research Engineering Manager at Facebook AI Research and lead author of SAM 2.
Just like our SAM 1 podcast, this is a multimodal pod because of the vision element, so we definitely encourage you to hop over to our YouTube at least for the demos, if not our faces.[00:03:04] AI Charlie: Watch out and take care.[00:03:10] Introducing Nikhila[00:03:10] swyx: Welcome to the latest podcast. I'm delighted to do Segment Anything 2; our first, one of our very first viral podcasts was Segment Anything 1 with Joseph. Welcome back. Thanks so much. And this time we are joined by the lead author of Segment Anything 2, Nikki Ravi, welcome.[00:03:25] Nikhila Ravi: Thank you. Thanks for having me.[00:03:26] swyx: There's a whole story that we can refer people back to, the episode of the podcast way back when, for the story of Segment Anything, but I think we're interested in just introducing you as a researcher, as a, on the human side: what was your path into AI research? Why, you know, why did you choose computer vision coming out of your specialization at Cambridge?[00:03:46] Nikhila Ravi: So I did my undergraduate degree in engineering at Cambridge University. The engineering program is very general. So first couple of years, you sort of study everything from mechanical engineering to fluid mechanics, structural mechanics, material science, and also computer science.[00:04:04] Nikhila Ravi: Towards the end of my degree, I started taking more classes in machine learning and computational neuroscience, and I really enjoyed it. And actually after graduating from undergrad, I had a place at Oxford to study medicine. And so I was initially planning on becoming a doctor, had everything planned, and then decided to take a gap year after finishing undergrad.[00:04:28] Nikhila Ravi: And actually that was around the time that sort of deep learning was emerging. And in my machine learning class in undergrad, I remember one day our professor came in, and that was when Google acquired DeepMind. And so that became like a huge thing. We talked about it for the whole class.
It kind of really stuck.[00:04:48] Nikhila Ravi: And I was kicked off thinking about, okay, maybe I want to try something different other than medicine. Maybe this is a different path I want to take. And then in the gap year, I did a bunch of coding, worked on a number of projects, did some sort of freelance contracting work. And then I got a scholarship to come and study in America.[00:05:06] Nikhila Ravi: So I went to Harvard for a year, took a bunch of computer science classes at Harvard and MIT, worked on a number of AI projects, especially in computer vision. I really, really enjoyed working in computer vision. I applied to Facebook and got this job at Facebook, at the time, now Meta, and I've been here for seven years: a very circuitous path, probably a very unconventional one. I didn't do a PhD, I'm not like a typical research scientist; I definitely came from more of an engineering background. But since being at Meta, I've had amazing opportunities to work across so many different interesting problems in computer vision, from 3D computer vision.[00:05:50] Nikhila Ravi: How can you go from images of objects to 3D structures? And then going back to 2D computer vision and actually understanding the objects and the pixels and the images themselves. So it's been a very interesting journey over the past seven years.[00:06:05] swyx: It's weird because like, I guess with Segment Anything 2, it's like 4D because you solve time, you know, you started with 3D and now you're solving the 4D.[00:06:14] Nikhila Ravi: Yeah, it's just going from 3D to images to video. It's really covering the full spectrum. And actually, one of the nice things has been, so I think I mentioned I wanted to become a doctor, but actually SAM is having so much impact in medicine, probably more than I could have ever had as a doctor myself.
So I think, you know, hopefully SAM 2 can also have a similar sort of impact in medicine and other fields.[00:06:39] The Impact of SAM 1 in 2023[00:06:39] swyx: Yeah. I want to give Joseph a chance to comment. Does that also mirror your, we know your story about going into, into vision, but like in the past year, since we did our podcast on SAM, what's been the impact that you've seen?[00:06:51] Joseph Nelson: Segment Anything set a new standard in computer vision. You know, recapping from the first release to present, SAM introduces the ability for models to, near zero shot (meaning without any training), identify kind of perfect polygons and outlines of items and objects inside images, and that capability previously required lots of manual labeling, lots of manual preparation, clicking very meticulously to create outlines of individuals and people.[00:07:25] Joseph Nelson: And there were some models that attempted to do zero shot segmentation of items inside images, though none were as high quality as Segment Anything. And with the introduction of Segment Anything, you can pass an image with SAM1, SAM2 videos as well, and get perfect pixel perfect outlines of most everything inside the images.[00:07:52] Joseph Nelson: Now there are some edge cases across domains, and similar to the human eye, sometimes you need to say, like, which item maybe you most care about for the downstream task and problem you're working on. Though, SAM has accelerated the rate at which developers are able to use computer vision in production applications.[00:08:13] Joseph Nelson: So, at RoboFlow, we were very quick to enable the community of computer vision developers and engineers to use SAM and apply it to their problems. The principal ways of using SAM, you could kind of use SAM as is to like pass an image and receive back masks.
Another use case for SAM is in preparation of data for other types of problems.[00:08:37] Joseph Nelson: So, for example, in the medical domain, let's say that you're working on a problem where you have a bunch of images from a wet lab experiment. And from each of those images, you need to count the presence of a particular protein that reacts to some experiment. To count all the individual protein reactions, you can go in, and lab assistants to this day will still like kind of individually count and say what are the presence of all those proteins.[00:09:07] Joseph Nelson: With Segment Anything, it's able to identify all of those individual items correctly. But often you may need to also add like a class name to what the protein is. Or you may need to say, hey, like, I care about the protein portion of this. I don't care about the rest of the portion of this in the image.[00:09:26] Joseph Nelson: And, or what it encourages and asks for the user to do is to provide some visual prompting to say, hey, which part, like, SAM says, hey, I can find segments of anything, but which segments do you care about? And so you can do visual prompting, which is kind of a new primitive that SAM introduced. And so at RoboFlow, we have one portion of our tool stack that enables users to very quickly label data.[00:09:48] Joseph Nelson: With Segment Anything, SAM can already provide, hey, here's where I see the outlines of objects. Or a user can click to prompt to say, hey, here's where the outlines of objects matter. And I recently pulled statistics from the usage of SAM in RoboFlow over the course of the last year. And users have labeled about 49 million images using Segment Anything on the hosted side of the RoboFlow platform.[00:10:12] Joseph Nelson: And that's like 5 million in the last 30 days alone. And of those images, we did kind of like a rough back-of-the-napkin calculation of like how much time that has saved.
Because, again, the alternative is you're clicking individual points to create a polygon, and with SAM you just click once and it guesses where the polygon is.[00:10:32] Joseph Nelson: And I'm sure in a bit we can maybe screen share and show some examples of what this experience is like. And in that time estimation, it, like, on average, saves, you know, maybe a dozen or so seconds. And we estimate that this is probably saved on the order of magnitude of 35 years of time for users.[00:10:53] Nikhila Ravi: That's incredible.[00:10:54] Joseph Nelson: So, I mean, basically like in the first, the first year of a model being available, not only can you say, hey, I'm just going to go use this model, those numbers, that like 49 million images, is an estimate directly related to just the hosted side. So imagine all of the users that are self hosting or using SAM for robotics applications or out in the field or offline, where it's not even, like, the time or the image counts are tabulated.[00:11:20] Joseph Nelson: And we're probably talking about, you know, just a fraction of the amount of value that's actually being produced for a number of downstream tasks. So to say that the impact has been, you know, people use terms like game changing and these sorts of things. It has changed the industry. It's set a new standard.[00:11:36] Joseph Nelson: And with the release of SAM 2, I think we're about to see an acceleration of those capabilities for a lot of reasons.[00:11:42] Nikhila Ravi: That's really great to hear. I think one of the really surprising things about SAM 1 was how many fields actually rely on manual segmentation. I think we're not really exposed to that. Maybe you are at Roboflow, because you get to see all the users of these tools.[00:11:57] Nikhila Ravi: But for me, it was, you know, people working on understanding coral reef bleaching or farmers counting their cows, and so many different applications that, as a researcher, you never get exposed to, but you can have impact towards.
So I think that was really awesome to hear.[00:12:15] Do People Finetune SAM?[00:12:15] swyx: So as sort of audience surrogate, who knows less than the two of you, I'm going to ask a really dumb question maybe, but is everyone using stock Segment Anything?[00:12:23] swyx: Are they fine tuning for the medical domain? Like how on earth could it work for the medical field without fine tuning, right? Like, is that a thing?[00:12:32] Nikhila Ravi: So I mean, I can give a quick perspective from the research side. So one of the things, design decisions we made in SAM was to not have class labels. And so all the data is annotated in a class agnostic way.[00:12:48] Nikhila Ravi: So anything that has a boundary, we consider to be an object. So for example, in any image, there's lots of small objects. We might not know what the names of them are, but if you can draw a boundary around it, it's an object. So you can imagine that we have 11 million images in the SA 1B dataset, we annotated all the objects, there's many, many small objects.[00:13:12] Nikhila Ravi: And so if you think about cells, they're also kind of small objects, there's probably things in the training data that looked like it, but we didn't have to label it. And so that means that even when you use SAM for applications that it wasn't really trained for, because we didn't restrict it to a certain set of categories, you can actually use it out of the box without custom adaptation.[00:13:35] Nikhila Ravi: But having said that, there's probably certain domains where you need some expertise in order to be able to segment something properly.
And for those use cases, having some extra fine tuning data would probably help, and we've sort of seen that there's some papers that have come out that do this, and, you know, we'd love to hear, Joseph, how people are collecting data with SAM and fine tuning for their use cases.[00:13:59] Joseph Nelson: Once SAM came out, there were adaptations that said, could we use SAM to be, you know, like, efficient SAM? Like, basically take SAM and maybe accelerate it. And then there were domain adapted SAMs, like CellSAM, for example, out of the UC system. Now, what's interesting is, there's, like, adapting SAM to a domain, there's kind of two ways by which that's done.[00:14:21] Joseph Nelson: One is, as you mentioned, like, potentially SAM doesn't have a good concept of the objects of interest. And so you need to do domain adaptation and increase the accuracy for zero shot prediction. The second way though, is it's not fine tuning. It's actually just prompting. It's just guiding the model's existing knowledge[00:14:42] Joseph Nelson: to say which segments you care about. And both those are actually kind of equally important on the application side. You need to, like, a priori ensure that the objects of interest can be correctly segmented, and maybe collect data to do that.
But even if you had, like, a perfect SAM, like an omniscient SAM that could see every segment in every domain with all pixels perfectly outlined, in production, you would still need some way to Almost like signal to the model what you care about like to paint this picture if you are like a retailer and you are providing Photos of models wearing your clothing on your retail site You may care about you know only the shirt and Sam by default might segment the full person And so there's you know visual prompting that you can do to ensure that you only outline Maybe the shirt for the purposes of swapping in and out different shirts for displaying a given model on a retail page You And so I think what's interesting is that's where, like I wouldn't call it domain adaptation, but that's where, like, when you apply to industry, like, one thing that's particularly important with tooling and enabling SAM to reach its full potential.[00:15:51] swyx: That's really encouraging to hear. I should also think, like, you know, the last time we talked about this, we wanted to, the very natural addition on the class labeling side is the grounding Dino work, right? So I think people, built a grounding SAM and all the other extensions.[00:16:05] Video Demo of SAM[00:16:05] swyx: I think it's, it's probably a good time to cut to a quick demo of SAM2 for people who are, who are tuning in for SAM2 and who better to demo SAM2 than Nikki.[00:16:15] Nikhila Ravi: Sure. So I'll try to narrate what I'm what I'm doing. So audio listeners can also understand. So we have a web demo where anyone can try SAM2 on a video. Here we have a video of someone kicking a football, and I'm going to click on the football to select the object in the first frame. But you can actually select the object in any frame of the video, and this will work.[00:16:40] Nikhila Ravi: The next step is to hit track. So the model's now tracking this in real time. We don't save any of this, it's all running in real time. 
And now you can see the ball has been tracked throughout the entire video. There's even, like, a little bit of a challenging case here where the shoe covers the football.[00:16:59] Nikhila Ravi: And actually, you know, the model makes a little bit of a mistake here, but that's okay, because we can actually add a refinement click. You can add negative clicks until we get the mask that we want on this frame. And then you can hit track again, and the model will track the object, taking into account the additional information I've provided at that frame.[00:17:25] Nikhila Ravi: We've also added a couple of other fun things you can do on top of the track, like add effects. We can add, you know, foreground effects, background effects. And these are just ways of showing how we can use the output from SAM2 as part of other tools, like video editing tools, other systems. So this is just a preview of what you can do with SAM2, but the really cool use cases are places where we might not have even imagined SAM2 being useful.[00:17:54] Nikhila Ravi: So we have a number of examples of things you might want to use it for. There's, like, underwater videos that it works actually really well for, even though the model's never really seen an octopus before, and octopuses have a lot of moving parts that SAM2 can actually quite effectively keep track of, all the different tentacles, and we can probably see it more clearly if I desaturate the background.[00:18:18] Nikhila Ravi: We can see that actually the tracking of all the different tentacles is quite accurate. Another challenge with video is that objects can actually become occluded. They can disappear from view and reappear. And a really fun example here is the shuffling cup game, which many of you might have seen. And so here I can click on the ball in the first frame.[00:18:41] Nikhila Ravi: I can also, you know, click on a different cup.
And so here, the additional challenge is that there's three cups that look exactly the same. And then there's the ball that will get occluded by the cup. So the ball's no longer visible, the cups are all moving around, they all look the same. But the model actually keeps track of the cup that we selected.[00:19:02] Nikhila Ravi: And, as you can see at the end, here I'll jump to the end so you can see, it actually finds the cup again. I wanted to point out a couple of fun demo UX features that we added that actually really helped with this. So if you can see at the bottom, there's these swim lanes, and the thickness of the swim lane actually tells you if the object's visible or not.[00:19:22] Nikhila Ravi: So at the beginning, the object's visible,[00:19:25] swyx: the object[00:19:26] Nikhila Ravi: disappears, and then the object comes back. So you can actually visually tell when the object's being occluded and when it's not, and so it's a nice way of, like, knowing if you need to go in and fix the model prediction or not. And so these are some of the UX innovations that we came up with, as well as the model innovations.[00:19:46] Joseph Nelson: One thing that I think is really notable here, there's two things. One is that, like, I'd love to have a little bit of a discussion about how the model's keeping track of the embedded scene to keep track of the ball and the cup in different places.
Put a pause on that for a second.[00:19:59] Why the Demo is so Important[00:19:59] Joseph Nelson: One thing that Meta has put an emphasis on here, to a much greater degree than other model releases, is the demo experience: recognizing that in addition to having a model that can do zero shot segmentation, you've created a web experience that allows folks to kind of experience both the video effects and the types of UX innovations that encourage usage and adoption.[00:20:23] Joseph Nelson: It's actually kind of reminiscent of how the underlying technology of ChatGPT was available prior to the web experience of ChatGPT. Can you talk a bit about why that was a consideration to your team and how you thought about the creation of the demo experience in tandem with training and releasing a new model?[00:20:41] Nikhila Ravi: Yeah, absolutely. I think that's a really great example of how, you know, ChatGPT was really more of a UX innovation. Obviously there were a number of research innovations that helped to get to this point, but as you said, like, the underlying technology was around for a while. And, you know, putting this UX around it as a chat interface helped tremendously with the[00:21:03] Nikhila Ravi: adoption and people understanding how it could be useful for real world use cases. And in computer vision especially, it's so visual. The best way to show how these models work is by trying it on your own image or your own video. With the original SAM, we put a lot of effort into building, like, a high quality demo.[00:21:23] Nikhila Ravi: And the other piece here is that the demo is actually the annotation tool. So we actually use the demo as a way to improve our annotation tool. And so then it becomes very natural to invest in building a good demo, because it speeds up your annotation and improves the data quality, and that will improve the model quality.[00:21:43] Nikhila Ravi: With this approach, we found it to be really successful.
And obviously externally, people really liked being able to try it. I think, you know, people in fields outside of machine learning would never have tried SAM if we didn't have that demo. And I think that definitely led to a lot of the adoption in, like, diverse fields.[00:22:05] Nikhila Ravi: And so because we saw that, with SAM 2, like, the demo was a priority, a first class citizen from day one. And so we really invested in making that. And I think with SAM2 as well, we wanted to have, like, a step change in the demo experience. Interactive video segmentation, I think that experience is something that maybe has not had much thought given to it.[00:22:27] Nikhila Ravi: And we really wanted to be like, okay, if we are to design a step-changing video segmentation experience, what would that look like? And that really did influence our model and annotation design as well.[00:22:40] Joseph Nelson: It's a really encouraging trend: not thinking about only the new model capability, but what sort of applications folks want to build with models downstream as a result of that.[00:22:49] Nikhila Ravi: I think it also really forces you to think about many things that you might postpone, for example, efficiency.[00:22:55] Joseph Nelson: Yes.[00:22:55] Nikhila Ravi: For a good demo experience, making it real time is super important. No one wants to wait. And so it really forces you to think about these things much sooner, and actually makes us think about what kind of image encoder we want to use, or, like, other hardware efficiency improvements.[00:23:13] Nikhila Ravi: So those kinds of things, I think, become a first class citizen when you put the demo first.[00:23:19] SAM 1 vs SAM 2 Architecture[00:23:19] Joseph Nelson: That's one thing I was going to ask about, and this is related to the architecture change. So SAM1 and the SAM1 demo experience.
You have the encoder that's creating the embeddings of all the potential spaces.[00:23:31] Joseph Nelson: That needs to be run on a GPU. That's a relatively intensive operation. But then the query of those embeddings can be run independently and on a cheaper process. So in the SAM1 demo, the way that it was structured, and also this is the way that we have our SAM tool structured in Roboflow as well, is images go to a GPU to get all the SAM based embeddings.[00:23:53] Joseph Nelson: But then for querying those embeddings, we do that client side, in the browser, so that the user can very quickly, you know, you can move your mouse over and you get the proposed candidate masks that SAM found for that region of the image. In SAM 2 you dropped that in the web demo. And I think that's because you made some notable improvements to the rate at which encoding happens.[00:24:16] Joseph Nelson: Can you talk a bit about what led to those speed increases and, again, how that interplays with providing a fast user experience for interacting with the model?[00:24:29] Nikhila Ravi: Yeah. So the SAM2 web demo is primarily focused on video. We, we decided to just keep it simple and focus on video, and on GitHub, we have a Colab notebook that shows how to run SAM2 on images.[00:24:41] Nikhila Ravi: So if you're interested in replacing SAM with SAM2 for images, check out GitHub. But on the SAM2 demo, it's not as straightforward to adopt the same architecture as SAM for video, because we can't send the per frame image embeddings for an entire video back to the front end. In SAM, each frame embedding was like four megabytes, but if you have a long video and that's, like, per frame, it would become impossible to send that back to the front end.[00:25:11] Nikhila Ravi: So, SAM 2 actually, in terms of the architecture details, I was actually just looking at this earlier, but the SAM1 model was around 630 million parameters.
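As an aside, Nikhila's four-megabytes-per-frame figure is easy to sanity-check. Assuming SAM 1's published image embedding shape of 256 channels at 64×64 spatial resolution in float32 (an assumption drawn from the original SAM paper, not stated in this conversation), the per-frame size works out exactly, and scaling it per frame shows why shipping embeddings to the browser stops working for video:

```python
# Back-of-envelope check of the "4 MB per frame" figure, assuming
# SAM 1's 256 x 64 x 64 float32 image embedding (4 bytes per value).
channels, height, width, bytes_per_float = 256, 64, 64, 4

frame_embedding_bytes = channels * height * width * bytes_per_float
frame_embedding_mb = frame_embedding_bytes / (1024 * 1024)

# A short 1-minute clip at 24 fps:
frames = 60 * 24
video_embedding_gb = frames * frame_embedding_bytes / (1024 ** 3)

print(f"{frame_embedding_mb:.0f} MB per frame")   # 4 MB per frame
print(f"{video_embedding_gb:.3f} GB per minute")  # ~5.6 GB per minute of video
```

Roughly 5.6 GB per minute of video makes it clear why the SAM 2 demo keeps everything server-side rather than mirroring SAM 1's send-embeddings-to-the-browser design.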
It's a fraction of the size of these large language models, so very small. Actually, SAM2, the largest model, is around 224 million parameters. So it's actually one third the size of the original SAM model.[00:25:38] Nikhila Ravi: So we changed the image encoder from a ViT-H in SAM to a Hiera model, which was also developed by Meta. So that definitely was something that helped. And in terms of the efficiency compared to SAM, if we were to run SAM per frame on a video versus run SAM2, it's around six times faster to run SAM2 than to run SAM per frame.[00:26:03] Nikhila Ravi: A number of things improved the efficiency of SAM2 such that we were actually able to run this entirely on the server and not have any component in the front end. But I am very curious to see who puts this on device. Like, I'm pretty sure soon we'll see, like, an on device SAM2 or, you know, maybe even running in the browser or something. So[00:26:25] Nikhila Ravi: I think that could definitely unlock some of these edge use cases, but we were able to make a compelling web demo without having to do that.[00:26:34] swyx: Hugging Face is probably already working on a Transformers.js version of it, but totally makes sense. I want to talk more about things from the paper, but I think we're still in this sort of demo section.[00:26:42] Video Demo of SAM on Roboflow[00:26:42] swyx: And so I want to hand it to Joseph for his demo to see what the Roboflow site looks like.[00:26:47] Joseph Nelson: So I can, I can give some context into one key area that, Nikhila, you mentioned earlier, which is: SAM has made the decision, both SAM 1 and SAM 2, to be class agnostic in terms of its predictions. And with that, you then have the ability to have a generalizable model for zero shot capability.[00:27:05] Joseph Nelson: However, in a lot of domain applications, you do want the class wise name.
And so a lot of the challenge can be adding that class wise name, for at least the annotation, to an experience that we've created. That's one of the key considerations. So I will similarly share my screen and show an example.[00:27:27] Joseph Nelson: Here, I have a bunch of images, and there's a number of ways that I could annotate things. Like, I could prompt a large multimodal model with, like, grounding capabilities, you know, you could outsource it, or I can do manual labeling. And with the manual labeling, this is where we make use of models like Segment Anything[00:27:45] Joseph Nelson: to propose candidate masks and make it faster. So we have, you know, this annotation pane and what we call the smart poly tool, which is powered by Segment Anything. This is currently Segment Anything 1. We're accelerating and seeing improvements from this, similar to what the paper shows: Segment Anything 2 performed better on[00:28:06] Joseph Nelson: images as well as video. But with Segment Anything I'm able to basically prompt regions of my image of interest. So for example, if, like, I wanted to say, I want to, like, add the drum set: you'll see here that, like, the original candidate proposal is just the bass drum, but let's say I wanted the whole drum set.[00:28:26] Joseph Nelson: So the UX primitive of being able to add and subtract candidate regions of interest is really intuitive here. And now, great, I have this outline, but in fact what I want is, I want to name that as a class. Because maybe for the model that I'm building, I want to build, like, a task specific model, you know, like an object detection model or an instance segmentation model.[00:28:50] Joseph Nelson: Or, you know, maybe I'm even using, like, a multimodal model, and I want that multimodal model to refer to regions of interest in the images as a specific thing. And so I think what's, you know, really powerful is, of course, like, I get this really rich zero shot prediction.
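The add/subtract click primitive Joseph describes can be sketched in plain Python: treat candidate masks as sets of pixel coordinates, and let each extra click narrow which candidate is meant. This is a toy simplification of interactive prompting, not SAM's actual decoder; the masks and click coordinates are made up for illustration:

```python
# Toy sketch of the add/subtract click primitive: masks are sets of pixel
# coords, and each positive/negative click narrows which candidate mask is
# meant. All coordinates and masks here are hypothetical.

bass_drum = {(x, y) for x in range(40, 60) for y in range(40, 60)}
drum_set = bass_drum | {(x, y) for x in range(20, 80) for y in range(10, 40)}

def select_mask(candidates, positive, negative):
    """Smallest candidate containing all positive clicks and no negative ones."""
    ok = [
        m for m in candidates
        if all(p in m for p in positive) and not any(n in m for n in negative)
    ]
    return min(ok, key=len) if ok else None

# One click on the bass drum proposes just the bass drum...
first = select_mask([bass_drum, drum_set], positive=[(50, 50)], negative=[])
# ...adding a second click on a cymbal expands the selection to the full kit.
both = select_mask([bass_drum, drum_set], positive=[(50, 50), (30, 20)], negative=[])
```

Choosing the smallest consistent mask mirrors the "bass drum first, whole kit after another click" behavior in the demo: ambiguity resolves toward the tightest region until the user's clicks rule it out.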
And here we have our friend Rick.[00:29:10] Joseph Nelson: So I get this really rich candidate set of predictions. But then by adding the class wise label, I can, you know, very quickly make sure that any downstream tasks are aware not just of the segment, but also of what is inside that segment. Which actually takes me to a separate point, of something that I predict is probably going to happen, and Nikhila, I'm actually kind of interested why maybe your team made a conscious decision to not do this initially with SAM2.[00:29:40] Joseph Nelson: There's been an emergent set of models that are also adding open text prompting capabilities to grounding models. So for example, you've seen models like Grounding DINO or OWL-ViT, which, you know, you can do even image to image or text to image based prompting with, to find regions of interest. And maybe, maybe I can actually give an example of that, even in the context of this same data.[00:30:05] Joseph Nelson: So if I wanted to try out, you know, Grounding DINO on this same set of images, I could try out, you know, prompting Grounding DINO for a set of different classes. And what's notable is, let's do, I don't know, let's prompt for person, and we'll prompt for, I don't know, microphone.[00:30:26] Joseph Nelson: Here I can text prompt the image, and then the understanding, in this case Grounding DINO's understanding, of where people are in this image allows me to create, in this case, bounding boxes, but, you know, soon you can do segmentations, or in tandem with SAM do segmentations. And, you know, we've already seen applications of using SAM2 in tandem with models like Grounding DINO or Florence 2,[00:30:54] Joseph Nelson: so that people can basically text prompt and then get the benefits of the zero shot segmentation at the same time as getting the open form querying.
And in doing so, you know, we maintain a framework called, like, Autodistill, so, like, folks can very quickly, you know, bring some images, and then use Autodistill to define some ontology and then prompt and say what you want from that ontology.[00:31:19] Nikhila Ravi: So you already do this for video as well?[00:31:21] Joseph Nelson: You can apply it to videos or groups of images, yes. So this is using a project called Autodistill. And the concept of Autodistill is: use a base model, like a big base model, which could be, like, SAM or Grounding DINO, and then you pass a directory of images, which also could be video broken into individual frames, and you pass an ontology as well.[00:31:43] Joseph Nelson: So an example I was just showing was, like, the hello world we have, which is, like, a shipping container. And then the combination of the grounding capabilities of, in the example I was showing, Florence 2 plus SAM looks for the concept of container, and then SAM does the rich segmentation of turning that concept of container into the candidate proposal of the region, so that a user could just say, hey, I want all the shipping containers, run this across a bunch of images or video frames, and then get back the class wise labels plus the regions of interest.[00:32:17] Joseph Nelson: And this feels like a natural extension. And in fact, between SAM1 and SAM2, the open form grounding capabilities became something the field was broadly adding. So I'm curious, like, from your perspective, one of the things I thought maybe SAM2 would do is actually add this capability natively. So I'm curious to hear, like, about the conscious decision to say, hey, we want to continue to be class agnostic.[00:32:39] Extending SAM 2 with other models[00:32:39] Joseph Nelson: We don't want to add, yet maybe, open form text prompting as a part of finding the segments and parts of images. And I'd love to hear about, like, the decision to think about it that way.
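The Autodistill flow Joseph outlines, a big grounded base model plus an ontology producing an auto-labeled dataset for a smaller task-specific model, can be sketched with stand-in objects. The shapes below mimic the library's concepts (an ontology mapping caption prompts to class names, a base model that labels a directory of frames), but the implementations are toy mocks, not the actual Roboflow Autodistill API:

```python
# Toy sketch of the Autodistill pattern: a grounded base model labels a
# directory of frames according to a prompt -> class-name ontology, producing
# training data for a smaller task-specific model. Mock implementations only.

ontology = {
    # caption prompt given to the base model  ->  class name you want back
    "shipping container": "container",
}

def mock_base_model(image, prompt):
    """Stand-in for Grounding DINO / Florence 2 + SAM: returns regions matching
    a text prompt. Real models return boxes/masks; we fake one box per image."""
    return [{"box": (10, 10, 80, 60), "prompt": prompt}]

def auto_label(images, ontology):
    """Run every ontology prompt over every image, emitting class-wise labels."""
    dataset = []
    for image in images:
        for prompt, class_name in ontology.items():
            for det in mock_base_model(image, prompt):
                dataset.append({"image": image, "class": class_name, "box": det["box"]})
    return dataset

frames = ["frame_000.jpg", "frame_001.jpg", "frame_002.jpg"]
labels = auto_label(frames, ontology)
```

The resulting `labels` list is what you would then train the smaller detector or segmenter on, which is the distillation step that gives the framework its name.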
And if you are encouraged by, or if you expect, kind of, what's happening here, where people are naturally combining these capabilities, as something that you would expect and encourage to happen despite not having it[00:33:00] Joseph Nelson: in the base model itself.[00:33:02] Nikhila Ravi: Yeah, it's a great question. So I think it's really cool that the community is taking SAM and taking SAM 2 and building on top of it and coming up with cool applications. We love to see that. That's exactly why we open source our work. And then in terms of why we didn't put it into SAM 2: so as you've probably seen with SAM and SAM 2, it's a fairly narrow problem,[00:33:25] Nikhila Ravi: but we really tried to make it a step change in the capability. And so with each version, we are trying to limit the focus to one thing that we know we can do really well. And in this case, like, the first SAM, it was class agnostic segmentation, but can we do it so well that it's effectively solved?[00:33:47] Nikhila Ravi: And similarly, can we do that same thing, but with video segmentation? So one step at a time, we are working on each of these problems one at a time, so that we can actually deliver something that's really world class and step changing.[00:34:03] Joseph Nelson: So does that mean SAM 3 will have the text prompting problem as, like, the next challenge?[00:34:09] Nikhila Ravi: Who knows, who knows? Maybe the community will, we'll build that too. So[00:34:15] Joseph Nelson: it makes sense to, like, very narrowly do something very well. And that's, I think, proven to be well accomplished.[00:34:21] Nikhila Ravi: It's like taking both the data, the model, and the demo, and how can we push all three towards solving one thing really well?[00:34:30] Nikhila Ravi: So we found that.
That's like a good recipe, and that's how we've limited the focus of each of these models.[00:34:38] swyx: This development reminds me of how, you know, when you break out the interpretability of ConvNets, you can see, like, oh, this is the edge detection one. I feel like SAM is the edge detection version equivalent.[00:34:51] swyx: And then you build up to whatever the next feature is on top of that.[00:34:54] Limitations of SAM: Screenshots[00:34:54] Joseph Nelson: Can I bring up one limitation of SAM? So, like, even with SAM one, SAM two, and the model was released at 4 PM Pacific on Monday, we're recording this at 11 AM Pacific on Thursday. So it's very fresh for a lot of the capabilities. And[00:35:09] Joseph Nelson: it is so clear that it is a stepwise change in the capability that, Nikhila, you mentioned your team wants to do, which is extend SAM's zero shot class agnostic capability to video. Like, A plus, kind of mission accomplished. One thing that's interesting is finding, like, domain problems where there might be still domain applicability and domain adaptation that is available.[00:35:32] Joseph Nelson: One benchmark that we introduced at CVPR is this thing called RF100, which is, like, seven different domain type problems that the industry commonly is working on in vision, like underwater, document processing, aerial examples, medicine examples. And one place where, interestingly, Segment Anything is maybe less performant than other models is handling screenshots.[00:35:57] Joseph Nelson: For example, like, a lot of folks that are building agents to interact with the web are particularly interested in that challenge of, given a screenshot of a computer, what are all the buttons, and how could I autonomously navigate and prompt and tell it to click?
And I can show an example of, like, how SAM kind of performs on this challenge, just to outline some of the context of this problem.[00:36:23] Joseph Nelson: But I'm curious, like, how you think about limitations like this and what you would expect to want to be the case. So here I just have a notebook where I run SAM on the source image on the left, and then the SAM output is on the right. And this is just a screenshot of a website; we just grabbed, like, the top 100 websites by traffic and grabbed screenshots from them.[00:36:42] Joseph Nelson: One example of a place where I could see the community improving on SAM, and I'm curious how you think about this challenge and maybe why SAM is less well adapted for this type of problem, is processing screenshots. So I'll share my screen to give an example. For, for viewers that are participating here, you see, like, an example screenshot of a website on the left, and on the right is SAM 2 running on that image.[00:37:06] Joseph Nelson: And in the context of agents, folks usually want to have, like, hey, tell me all of the buttons that an agent could press, tell me, like, maybe the headlines of the articles, tell me the individual images. And SAM 2 behaves perhaps predictably, where it outlines, like, people in the images and some of the screen text.[00:37:22] Joseph Nelson: I'm curious, like, how you think about a challenge like this for a model that sees everything in the world: what about handling digital contexts, and why maybe it could perform better here, and how you would expect to see improvement for domains that might have been out of distribution from the training data?[00:37:40] Nikhila Ravi: Yeah, this is a good question. So at FAIR, we don't really build with a specific use case in mind. We try to build, like, these foundational models that can be applied to lots of different use cases out of the box.
So I think in this kind of example, potentially people might want to annotate some data and[00:37:59] Nikhila Ravi: fine tune on top of what we release. I think we probably won't build things that are very custom for different use cases. I think that's not a direction we'll go in. But as you said, like, the model is an annotation tool to improve the model. And so I think that's definitely the approach we want to take: we provide the tools for you to improve the model, as well as the model itself.[00:38:27] Joseph Nelson: That makes sense. Focus on, like, as many multi or zero shot problems, and then allow the community to pick up the torch for domain adaptation.[00:38:34] Nikhila Ravi: Yeah, absolutely. Like, we can't solve all the problems ourselves. Like, we can't solve all the different domains. But if we can provide a sort of base hammer tool, then people can apply it to all their different problems.[00:38:48] SAM 2 Paper[00:38:48] swyx: If you don't mind, I guess we want to transition to a little bit of asking more questions about the paper.[00:38:53] Udio AI: Sure.[00:38:54] swyx: There's a lot in here. I love the transparency from Meta recently, with, like, Llama 3 last week and then, and was it last week? Maybe, maybe a little bit less than last week. But just, like, just really, really well written and a lot of disclosures, including the data set as well.[00:39:08] SA-V Dataset and SAM Data Engine[00:39:08] swyx: I think the top question that people had on the data set: you know, you released a diverse set of videos, and there was, there's a lot of discussion about the data engine as well, which I really love, and I think it's innovative. I think the top question is, like, how do you decide the size of the data set?[00:39:22] swyx: You know, what were you constrained by? People are asking about scaling laws. You had some ablations, but as a research manager for this whole thing, like, how do you decide what you need?[00:39:32] Nikhila Ravi: Yeah.
I mean, it's a great question. I think, as with all papers, you write them at the end of the project, so we can put these nice plots at the end. But going into it, I think, you know, the data engine design really follows[00:39:47] Nikhila Ravi: the model design: how we thought about the task, how we thought of the model capabilities. You can really see it's reflected in the different phases of the data engine. We started with just SAM; we apply SAM per frame. That's like the most basic way of extending SAM to video. Then the most obvious thing to do is to take the output masks from SAM and then provide them as input into a video object segmentation model that takes the mask as the first frame input.[00:40:19] Nikhila Ravi: And that's exactly what we did. We had SAM plus a version of SAM2 that only had the mask as input. And then in the last phase, we got rid of SAM entirely and just had this one unified model that can do both image and video segmentation. It can do everything in just one model. And we found that, you know, going from each phase, it both improved the efficiency and it improved the data quality.[00:40:46] Nikhila Ravi: And in particular, when you get rid of this two part model, one of the advantages is that when you make refinement clicks: so, you prompt the model in one frame to select an object, then you propagate those predictions to all the other frames of the video to track the object. But if the model makes a mistake and you want to correct it, when you have this unified model, you only need to provide refinement clicks.[00:41:14] Nikhila Ravi: So you can provide maybe a negative click to remove a region or a positive click to add a region. But if you had this decoupled model, you would have to delete that frame prediction and re-annotate from scratch.
And so you can imagine, for more complex objects, this is actually adding, like, a lot of extra time to redefine that object every time you want to make a correction.[00:41:39] Nikhila Ravi: So both the data and the data engine phases really follow, like, how we thought about the model design and the evolution of the capabilities, because it really helped us to improve the data quality and the annotation efficiency as well.[00:41:54] swyx: Yeah, you had a really nice table with, like, time taken to annotate, and it was just going down and down.[00:41:58] swyx: I think it was, like, down by, like, 90 percent by the time you hit stage[00:42:02] Joseph Nelson: three, which is kind of cool. We joke that when SAM 1 came out, at Roboflow we're like, was this purpose built for our software? Like, you have the embedding from a big model, and the querying of the embeddings with a smaller model that happens in browser, which felt remarkably aligned. Now, hearing you talk about how you think about building models with a demo in mind, it makes sense. Like, you're thinking about the ways that folks downstream are going to be consuming and creating value. So, what felt like maybe a coincidence was perhaps a deliberate choice by Meta to take into account how industry is going to take seminal advances and apply them.[00:42:36] Nikhila Ravi: Yeah. And it's not just humans. Like, it could also be a model that outputs boxes that then get fed into this model. So really thinking about this as a component that could be used by a human, or as a component as part of a larger AI system. And that has, you know, a number of design requirements. It needs to be promptable.[00:42:56] Nikhila Ravi: It needs to have the zero shot generalization capability. We, you know, need it to be real time.
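The big-encoder/small-decoder split Joseph recaps here, one heavy encoding pass on a GPU and then many cheap prompt-time decodes in the browser, can be sketched with stand-in classes. All class and method names below are hypothetical, not the real SAM API; the point is only the cost structure:

```python
# Minimal sketch of the SAM 1 demo architecture: the heavy image encoder runs
# once (server-side, on a GPU), and the lightweight prompt decoder runs once
# per mouse-over (client-side). Stand-in classes only, not the real SAM API.

class HeavyEncoder:
    """Stands in for the image encoder (expensive, run once per image)."""
    def __init__(self):
        self.calls = 0

    def encode(self, image):
        self.calls += 1
        return {"embedding_for": image}  # placeholder for the real embedding tensor

class LightDecoder:
    """Stands in for the prompt encoder + mask decoder (cheap, run per click)."""
    def decode(self, embedding, prompt):
        return {"mask_for": prompt, "from": embedding["embedding_for"]}

encoder = HeavyEncoder()
decoder = LightDecoder()

embedding = encoder.encode("photo.jpg")  # one GPU round-trip
# 100 mouse-over prompts reuse the same embedding, no further encoder calls:
masks = [decoder.decode(embedding, (x, 50)) for x in range(100)]
```

A hundred hover prompts cost exactly one encoder call, which is why the interaction feels instant: the expensive work is amortized across every subsequent query.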
Those requirements really are very core to how we think about these models.[00:43:08] Memory Attention to solve Video[00:43:08] swyx: I cannot end this podcast without talking about the architecture, because this is, effectively, your sort of research level, architecture level innovation that enabled what I've been calling object permanence for SAM.[00:43:22] swyx: And it's memory attention. What was the inspiration going into it? And, you know, what did you find?[00:43:27] Nikhila Ravi: Yeah, so at a high level, the way we think about extending SAM to video is that an image is just a special case of a video that just has one frame. With that idea in mind, we can extend the SAM architecture to be able to support segmentation across videos.[00:43:45] Nikhila Ravi: So this is a quick video that shows how this works. In the SAM architecture, we have the image encoder, we have a prompt encoder, we have a mask decoder. You can click on an image, and that basically is a prompt; we use that prompt along with the image embedding to make a mask prediction for that image. Going to SAM2, we can also apply SAM2 to images, because we can, you know, as I said, treat an image as a video with a single frame.[00:44:15] Nikhila Ravi: And so in the SAM2 architecture, we introduce this new memory mechanism that consists of three main components. There's memory attention, there's a memory encoder, and then there's a memory bank. And when we apply SAM2 to images, these are effectively not used, and the architecture just collapses down to the original SAM architecture.[00:44:35] Nikhila Ravi: But when we do apply this to video, the memory components become really useful, because they provide the context of the target object from other frames. And so this could be from past frames. There's two types of memory.
So there's, like, the conditional frames, or the prompted frames, which are basically the frames at which a user or a model provides input, like clicks.[00:45:01] Nikhila Ravi: And then there's, like, the surrounding frames. And say we use six frames around the current frame as memory of the object. So there's both those types of memory that we use to make the prediction. Going into a little bit more detail about that, there's, like, two kinds of memory that we use.[00:45:18] Nikhila Ravi: So one is, like, spatial memory. So it's, like, this high resolution memory that captures the spatial details. And then we also have this, like, longer term object pointer memory that captures some of the sort of higher level concepts. And I think, Swyx, you had a comment about how this relates to sort of the context window in LLMs.[00:45:37] Nikhila Ravi: And both of these types of memories have some relation to context window; they both provide different types of information, on the spatial side or in terms of the concept of the objects that we want to track. And so we found that having, like, a six frame length for the spatial memory, coupled with this longer period of the object pointer memory, provides strong video segmentation accuracy at high speed.[00:46:01] Nikhila Ravi: So, as I mentioned, the real time aspect is really important. We have to find this speed accuracy trade off. And one way in which we sort of circumvent this is by allowing additional prompts on subsequent frames. So even if the model makes a mistake, maybe it loses the object after an occlusion, you can provide another prompt, which actually goes into the memory.[00:46:24] Nikhila Ravi: And so the prompted frames are always in the memory. And so if you provide a prompt on a frame, we will, or the model will, always remember what you provided.
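The memory behavior Nikhila describes, a rolling window of (say) six recent frames plus prompted frames that are never evicted, can be sketched as a small data structure. This is a simplification of the paper's memory bank with made-up frame payloads, not the actual SAM 2 implementation:

```python
from collections import deque

class MemoryBankSketch:
    """Simplified model of SAM 2's memory bank: recent-frame memories live in
    a fixed-size FIFO, while prompted (conditioning) frames are kept forever."""

    def __init__(self, num_recent=6):
        self.recent = deque(maxlen=num_recent)  # oldest entry evicted automatically
        self.prompted = {}                      # frame_idx -> memory, never evicted

    def add_frame(self, frame_idx, memory, is_prompted=False):
        if is_prompted:
            self.prompted[frame_idx] = memory
        self.recent.append((frame_idx, memory))

    def context(self):
        """Memories attended over when segmenting the current frame."""
        return dict(self.recent) | self.prompted

bank = MemoryBankSketch(num_recent=6)
bank.add_frame(0, "mem0", is_prompted=True)  # user clicked on frame 0
for i in range(1, 100):                      # 99 unprompted frames follow
    bank.add_frame(i, f"mem{i}")
```

Even after 99 more frames, frame 0's memory is still in the attended context because it was prompted, while the unprompted frames only survive in the six-slot FIFO. That is the mechanism behind "the prompted frames are always in the memory."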
And so that's a way in which we can sort of avoid some of the model failure cases. That actually is a big limitation of current video object segmentation models:[00:46:45] Nikhila Ravi: they don't allow any way to recover if the model makes a mistake. And so, Joseph, going back to your point about the demo, that's something that we found just by playing with these models. There's no way to make a correction, and in many real world use cases, like, it's not going to be a one time prediction, but you actually want to be able to intervene. Like, if an LLM makes a mistake, you can actually be like, no, actually do it this way, and provide feedback. And so we really want to bring some of that thinking into how we build these computer vision models as well.[00:47:16] "Context Length" in Memory Attention[00:47:16] swyx: Amazing. My main reaction to finding out about the context length of eight input frames and six past frames as the default is: why not 60? Why not 600? In text language models, we're very used to severely extending context windows. And what does that do to the memory of your model?[00:47:35] Nikhila Ravi: So I think maybe one, one thing that's different is that the object in video, it is challenging.[00:47:41] Nikhila Ravi: Objects can, you know, change in appearance. There's different lighting conditions. They can deform. But I think a difference to language models is probably that the amount of context that you need is significantly less than maintaining a long multi-turn conversation. And so, you know, coupling this short term spatial memory with these, like, longer term object pointers, we found was enough.[00:48:03] Nikhila Ravi: So, I think that's probably one difference between vision models and LLMs.[00:48:09] Object Tracking[00:48:09] Joseph Nelson: I think so.
If one wanted to be really precise with how the literature refers to object re-identification: object re-identification is not only what SAM does in identifying that an object is similar across frames, it's also assigning a unique ID.[00:48:25] Joseph Nelson: How do you think about models keeping track of occurrences of objects, in addition to seeing that the same looking thing is present in multiple places?[00:48:37] Nikhila Ravi: Yeah, it's a good question. I think, you know, SAM2 definitely isn't perfect, and there's many limitations that we'd love to see people in the community help us address. But one definitely challenging case is where there are multiple similar looking objects, especially in a crowded scene; keeping track of the target object is a challenge.[00:49:03] Nikhila Ravi: That's still something that I don't know if we've solved perfectly, but again, the ability to provide refinement clicks is one way to circumvent that problem. In most cases, when there's lots of similar looking objects, if you add enough refinement clicks, you can get the perfect track throughout the video.[00:49:22] Nikhila Ravi: So definitely that's one way to solve that problem. You know, we could have better motion estimation, we could do other things in the model, to be able to disambiguate similar looking objects more effectively.[00:49:35] swyx: I'm just interested in leaving breadcrumbs for other researchers, anyone interested in this kind of architecture.[00:49:41] swyx: Are there papers that you would refer people to that are influential in your thinking, or that have other interesting alternative approaches?[00:49:49] Nikhila Ravi: I think there's other ways in which you can do tracking in video. You might not even need the full mask.
There are some other works that just track points on objects.[00:49:59] Nikhila Ravi: It really depends on what your application is. If you don't care about the entire mask, you could just track a bounding box, or just track a point on an object. And so having the high fidelity mask might not actually be necessary for certain use cases. From that perspective, you might not need the full capabilities of SAM or SAM2.[00:50:19] Nikhila Ravi: There's many different approaches to tracking. I would encourage people to think about what they actually need for their use case and then try to find something that fits, versus, yeah, maybe SAM2 is too much; maybe you don't even need the full mask.[00:50:37] swyx: Makes total sense, but you have solved the problem that you set out to solve, which is no mean feat, and something that we're still appreciating even today.[00:50:44] The Future of FAIR[00:50:44] swyx: If there are no further questions, I would just transition to forward looking stuff. Joseph already hinted at our interest in SAM and the future of SAM, and obviously you're the best person to ask about that. I'm also interested in how external people should think about FAIR. You know, there's all this stuff going on: Llama, Chameleon, Voicebox, ImageBind. How are things organized?[00:51:09] swyx: And, you know, where are things trending?[00:51:11] Nikhila Ravi: Yeah, so in FAIR, we have a number of different research areas. I work in an area called perception. So we build vision systems that look at basically all the fundamental problems in computer vision: can we build a step change in all of these different capabilities?[00:51:29] Nikhila Ravi: SAM was one example. SAM2 is another example.
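Nikhila's point a moment ago, that some applications only need a box or a point rather than the full mask, is easy to see in code. A minimal sketch with NumPy, assuming a boolean H x W mask; the function names are illustrative, not from any SAM API.

```python
import numpy as np

def mask_to_bbox(mask):
    """Tight bounding box (x0, y0, x1, y1) around a boolean mask."""
    ys, xs = np.nonzero(mask)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

def mask_to_point(mask):
    """Single representative track point: the centroid of the mask."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# A small example mask: a 3x2 rectangle of True inside a 6x6 frame.
mask = np.zeros((6, 6), dtype=bool)
mask[2:5, 1:3] = True
```

Downstream trackers that only consume boxes or points can run on these reductions, which is the sense in which the high fidelity mask "might not actually be necessary" for a given use case.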
There are tons of other problems in computer vision where we've made a lot of progress, but can we really say that they're solved? And so that's really the area in which I work. And then there's a number of other research areas in language and in embodied AI,[00:51:49] Nikhila Ravi: and more efficient models, and various other topics. So FAIR in general is still very much pushing the boundaries on solving these foundational problems across different domains. Well,[00:52:07] swyx: fair enough. Maybe just outside of FAIR, the future of computer vision, right?[00:52:10] CVPR, Trends in Vision[00:52:10] swyx: You are very involved in the community. What's the talk of the town at CVPR? You both went; who's doing the most interesting work? It's a question for both of you.[00:52:19] Joseph Nelson: I think the trends we're seeing towards more zero shot capability for common examples will accelerate. I think multimodality, meaning using, you know, images in tandem with text for richer understanding, or images and video in tandem with audio and other mixed media, will be a continued acceleration trend.[00:52:43] Joseph Nelson: The way I kind of see the field continuing to progress: the problem statement of computer vision is making sense of visual input. And I think about the world as the things that need to be observed following your traditional bell curve, where the things that most frequently exist out in the world are in the center of that bell curve.[00:53:05] Joseph Nelson: And then there's things that are less frequently occurring that are in those long tails. For example, as far back as 2014, you have the COCO dataset, which sets out to say, hey, can we find 80 common objects in context, like silverware and fridges and these sorts of things.
And we also conceptualized the challenge of computer vision in terms of breaking it down into individual task types, because those were the tools we had for the day.[00:53:29] Joseph Nelson: So that's why, you know, you have the origination of classification, object detection, instance segmentation. And then, as things continue to progress, you have models that need to observe areas in the long tails. And so if you think of the COCO dataset as the center of that bell curve, I think of the long tails as the really edge case problems.[00:53:49] Joseph Nelson: Some of our customers, like Rivian, for example: only Rivian knows what the inside of a Rivian should look like as it's assembled and put together before it makes its way to a customer, and they're making custom parts, right? So how could a model have been trained on the things that go inside the componentry of producing a vehicle? What's kind of happening with computer vision is you're seeing models that generalize in the middle of the bell curve push outward faster.[00:54:17] Joseph Nelson: That's where you see the advent of open text models, or the richness of understanding of multimodal models, to allow richer understanding without perhaps any training, or maybe just using pre-training and applying it to a given problem. And then there's, like, the messy middle in between those two, right?[00:54:38] Joseph Nelson: So like, Nikhila kind of talked about examples where SAM does well out of distribution, where it finds an octopus even though there weren't octopi in the training data.
I showed an example with screenshots, where SAM isn't yet super great, so maybe that's in the messy middle or in the longer tails for now.[00:54:54] Joseph Nelson: But what's going to happen is there need to be systems of validation. The point of view that I think about is, like, tooling to also validate that models are doing what we want them to do, adapting to the datasets that we want them to adapt to. And so there's a lot of things on a forward looking basis that allow propelling that expansion of generalizability.[00:55:14] Joseph Nelson: That's for open text problems. That's where scaling up of training and of dataset curation continues to play a massive role. Something that's notable, I think, about SAM2 is it's, what, 57,000 videos? 51,[00:55:30] Nikhila Ravi: 000 videos? About 51,000, yeah.[00:55:32] Joseph Nelson: And 100,000 internal datasets. That's, like, not massive, right? And the model size also isn't, you know, the largest, the largest model being a couple hundred million parameters.[00:55:43] Joseph Nelson: The smallest model is 38 million parameters and can run at 45 FPS on an A100, right? We're going to see more capable, more generalizable models, able to run on a wider array of problems with zero or multi shot capability, at a faster rate. And I think the architecture innovations in things like SAM2's memory, with transformers increasingly making their way into vision, and probably blended architectures increasingly too.[00:56:15] Joseph Nelson: So my viewpoint on a go forward basis is we will have that bell curve of what humans can see, both in the center of that curve and the long tails.
And architectural changes allow richer understanding, multi and zero shot, and putting those into systems, into industry, and into contexts that allow using them in practical and pragmatic ways.[00:56:38] Joseph Nelson: Nikhila, I'd love to hear your thoughts and perspective on how you think the research trends map, or don't map, to that, and maybe some of the key innovations that you saw at CVPR this year that got you excited about the direction, and maybe some promising early directions that you're thinking about researching or pushing the boundaries of further.[00:56:56] Nikhila Ravi: Yeah, I just wanted to actually reply to a couple of things that you said. So actually, in video object segmentation, the number of classes that are annotated, and the size of these datasets, are really small. So with SAM, you know, we had a billion masks, we had 11 million images; we didn't have class labels.[00:57:17] Nikhila Ravi: But even before that, there were a lot of image datasets that are annotated with a lot of class labels, whereas in video datasets the number of class labels is very small. There's YouTube-VOS, which has 94 object categories; there's MOSE, which has around 30 or so object categories.[00:57:38] Nikhila Ravi: And they're usually, like, people, cars, dogs and cats and all these common objects, but they don't really cover a very large number of object categories. And so while SAM learned this general notion of what an object is in an image, these video tracking models actually don't have that knowledge at all.[00:58:01] Nikhila Ravi: And so that's why having this dataset is really important for the segment anything capability in video, because if you just provide the mask as the input to an off the shelf video object segmentation model.
It might not actually be able to track that arbitrary object mask as effectively as a SAM2 model that's actually trained to track[00:58:24] Nikhila Ravi: any object across the entire video. So combining two models together to try to get that capability will actually only get you so far, and being able to actually create the dataset to enable that segment anything capability was really important. And we can actually see that when we do comparisons with baselines, where we provide SAM2 with the same input mask and the baseline model with the same input mask.[00:58:53] Nikhila Ravi: For example, the t-shirt of a person: SAM2 can track the t-shirt effectively across the entire video, whereas these baselines might actually start tracking the entire person, because that's what they're used to doing, and isolating it to just one part of the person is not something they were ever trained to do. And so those are some of the limitations.
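One practical reason the memory window stays small, implicit in the speed accuracy trade off swyx asked about ("why not 60? why not 600?"), is that memory attention cost grows with the number of memory frames, so a 600-frame window would blow the real time budget. A back-of-envelope sketch; the token counts are illustrative numbers, not measured SAM2 costs.

```python
def attention_cost(num_memory_frames, tokens_per_frame, query_tokens):
    """Query-key interactions for cross-attention from the current frame's
    tokens into the memory bank; grows linearly with memory length."""
    return query_tokens * num_memory_frames * tokens_per_frame

# Hypothetical token counts: 1024 tokens per frame, 1024 query tokens.
small = attention_cost(6, 1024, 1024)     # six-frame window
large = attention_cost(600, 1024, 1024)   # 100x longer window -> 100x the work
```

A 100x longer memory means roughly 100x the memory attention work per frame, which is hard to reconcile with a 45 FPS (about 22 ms per frame) real time target.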

The Athletic Hockey Show: A show about the NHL
Is Byfield's new Kings deal MacKinnon-esque?

The Athletic Hockey Show: A show about the NHL

Play Episode Listen Later Jul 17, 2024 62:37


On today's Wednesday episode of The Athletic Hockey Show, Gentille and DGB discuss Quinton Byfield's interesting deal with the LA Kings, break down Dom's list of the 10 best contracts in the NHL right now, and answer a bunch of listener questions to close things out. Hosts: Sean Gentille and Sean McIndoe Executive Producer: Chris Flannery Producer: Chris Flannery Learn more about your ad choices. Visit megaphone.fm/adchoices

The Spurs Up Show
Is Texas Ready To Supplant Georgia As The NEW KINGS Of SEC Football?

The Spurs Up Show

Play Episode Listen Later May 23, 2024 35:21


Brad Crawford of 247Sports/CBS Sports joins Chris Phillips of SEC Unfiltered to talk SEC football including top CFB Playoff contenders entering this season, Post-Spring QB rankings, why OU is a "mid-tier" program in the SEC and much more. ⬇️ Support SECU ⬇️ PRIZE PICKS Use code "SECU" to receive a 100% deposit match up to $100. https://app.prizepicks.com/sign-up?invite_code=SECU&af_xp=social&source_caller=ui&pid=SECU&utm_content=SECU&utm_source=partner&shortlink=SECU&utm_medium=influencer&utm_campaign=100depositmatch&utm_term=SECU&c=SECU MYBOOKIE Use code "SECU" at signup to receive a special Welcome Offer from MyBookie. https://www.mybookie.ag/ RHOBACK Use code "SECU" for 20% off your first purchase at https://rhoback.com/. SEAT GEEK Use code "SECU" for $20 off your first purchase at https://seatgeek.com/. Subscribe to SEC Unfiltered, the best SEC podcast on the internet: https://podcasts.apple.com/us/podcast/sec-unfiltered/id1441899352 Website: https://www.secunfiltered.com/ X: ​https://twitter.com/SECUnfiltered Instagram: https://www.instagram.com/secunfiitered/ Facebook: ​https://www.facebook.com/SECUnfiItered Podcast: https://podcasts.apple.com/pl/podcast/sec-unfiltered/id1441899352 Let's get it! Learn more about your ad choices. Visit podcastchoices.com/adchoices

Lunchtime With Roggin And Rodney
5/23 H2: Jim Hiller; CP3 to the Lakers? Pete Sickle?

Lunchtime With Roggin And Rodney

Play Episode Listen Later May 23, 2024 44:33 Transcription Available


New Kings head coach Jim Hiller joins the show. Could Chris Paul join the Lakers this offseason? The co-founder of Rugby Football Club Los Angeles joins us

ESPN FC
New Kings of Germany

ESPN FC

Play Episode Listen Later Apr 14, 2024 64:59


The FC crew react to Bayer Leverkusen's first ever Bundesliga title and preview where Xabi Alonso's future is headed. Then, they break down shock losses for Arsenal and Liverpool as Manchester City takes over pole position of the Premier League title race. Plus, the guys break down why both sides will be disappointed in not taking their chances and look ahead to the remaining schedules for all three clubs. Learn more about your ad choices. Visit megaphone.fm/adchoices

The Crowded Booth
Is UGA the new kings of the SEC? | Georgia Bulldogs Football

The Crowded Booth

Play Episode Listen Later Jan 16, 2024 29:42


Bryce Koon is joined by On3's Palmer Thombs to recap the latest news surrounding Georgia football Make sure to subscribe to the channel for podcasts, gaming and more! #collegefootball #georgia #georgiafootball WEBSITE: https://thecrowdedbooth.substack.com/ DISCORD: https://discord.gg/dWYnG2MnW5 MERCH: https://www.bonfire.com/store/the-crowded-booth/

Shirtless Plantain Show
SPS Weekend Review - The New Kings of Catalonia - Episode 346

Shirtless Plantain Show

Play Episode Listen Later Dec 11, 2023 75:25


Frankfurt did their best against Bayern, but Girona made the biggest statement on the continent this season with an impressive win away at Barcelona. Join Coach & Deen for a review of those games, a double-shift in the Premier League, bangers all around Europe & a brief Champions League preview. Tap in!

Weekend Shows
Time To Turn to the New Kings of Boston | 'Breaking Boston'

Weekend Shows

Play Episode Listen Later Nov 16, 2023 17:06


From 'Breaking Boston' (subscribe here): The Celtics now join the Bruins as conference top dogs as Tatum lights up divisional rival Philadelphia to push them atop the Eastern Conference. Even with team muscles being down for the count with injuries, the Celtics showed up on the road to cement where they stand as contenders once again in the NBA. To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices

The Chase Thomas Podcast
Texas Rangers New Kings Of The AL, Kim Ng Leaves Marlins & Juan Soto To The Red Sox? With Fangraphs' Jon Tayler

The Chase Thomas Podcast

Play Episode Listen Later Oct 19, 2023 49:05


Chase Thomas is the Sports Renaissance Man, Atlanta Sports Guy & VFL. On today's program, Chase is joined by Fangraphs' Jon Tayler to talk about why the Texas Rangers will be the kings of the AL going forward, Kim Ng leaving the Marlins, whether the Red Sox should trade for Juan Soto, and a World Series preview between the Phillies and Rangers. Host: Chase Thomas. Guests: Jon Tayler. To learn more about CT and the pod please go visit: https://chasethomaspodcast.com By the way, this is a free, independent national sports podcast. To keep it that way, I'm going to need some help from you guys. If you're a fan of the pod and you haven't already, take a second right now and leave the show a 5-star rating and a review on Apple, Spotify or wherever you get your podcasts. It really does help, and it's so quick and easy to do. Thanks, y'all! Keep up with Chase on social media: Follow me on Twitter: https://twitter.com/PodChaseThomas Follow me on Instagram: https://bit.ly/3kFHPDn Follow me on TikTok: https://bit.ly/3JdZ3RF 'Like' me on Facebook: https://bit.ly/3ZmURo4 Hosted on Acast. See acast.com/privacy for more information.

The Compound Show with Downtown Josh Brown
The New Kings of Wall Street

The Compound Show with Downtown Josh Brown

Play Episode Listen Later Oct 13, 2023 81:10


On episode 113 of The Compound and Friends, Michael Batnick is joined by Jan van Eck and Andrew Beer to discuss: the bond market, the national debt, private-credit, alts, managed futures, the Sam Bankman-Fried trial, and much more! This episode is brought to you by iShares. To learn more about the iShares Semiconductor ETF (SOXX), visit: https://www.ishares.com/us/products/239705/ishares-semiconductor-etf Check out the latest in financial blogger fashion at The Compound shop: https://www.idontshop.com Investing involves the risk of loss. This podcast is for informational purposes only and should not be regarded as personalized investment advice or relied upon for investment decisions. Michael Batnick and Josh Brown are employees of Ritholtz Wealth Management and may maintain positions in the securities discussed in this video. All opinions expressed by them are solely their own opinion and do not reflect the opinion of Ritholtz Wealth Management. Wealthcast Media, an affiliate of Ritholtz Wealth Management, receives payment from various entities for advertisements in affiliated podcasts, blogs and emails. Inclusion of such advertisements does not constitute or imply endorsement, sponsorship or recommendation thereof, or any affiliation therewith, by the Content Creator or by Ritholtz Wealth Management or any of its employees. For additional advertisement disclaimers see here https://ritholtzwealth.com/advertising-disclaimers. Investments in securities involve the risk of loss. Any mention of a particular security and related performance data is not a recommendation to buy or sell that security. The information provided on this website (including any information that may be accessed through this website) is not directed at any investor or category of investors and is provided solely as general information. Obviously nothing on this channel should be considered as personalized financial advice or a solicitation to buy or sell any securities. 
See our disclosures here: https://ritholtzwealth.com/podcast-youtube-disclosures/ Learn more about your ad choices. Visit megaphone.fm/adchoices

GRIDIRON ZEROES
EP.085 - LIONS ARE THE NEW KINGS OF THE NORTH

GRIDIRON ZEROES

Play Episode Listen Later Oct 4, 2023 55:10


ON THIS WEEK'S GZP! Lions recap with the dominating win over the Green Bay Packers! Al and Lucas discuss what this means for the NFC North moving forward. We also jump into the upcoming game with the Carolina Panthers and Al gives his biggest demand yet. To start the show we "clean out the toy bin" since the NFL went full Toy Story mode on us this weekend. Al and Lucas discuss what "toys" (players, teams) they're either still playing with, putting at the bottom of the toy bin, or straight up throwing away. Finally, Follow the Money! Lock of the week has a new stipulation that has Al sweating, so be sure to stay tuned to the end of the show for that. Enjoy!

Club Shay Shay
520 South

Club Shay Shay

Play Episode Listen Later Oct 3, 2023 67:24 Transcription Available


85 South pulled up to 520! The New Kings of Comedy blessed us with some insight on the creation of 85 South, Tour Life, the Evolution of Comedy, and much more! For bonus episodes, BTS, and early access subscribe to Club 520 on Patreon! https://www.patreon.com/Club520 Like Share & Subscribe Follow us everywhere @club520podcast #club #volume See omnystudio.com/listener for privacy information.


The BAM
MLB 2023 - End of the Year Predictions and the New Kings of the AL West

The BAM

Play Episode Listen Later Sep 26, 2023 64:49


The baseball fellas talk end of the year predictions and more.Contact The BAM Connect with us on Twitter, Instagram, Tik Tok, Facebook, and YouTube! Shoot us an email if you have any questions: thebampod1@gmail.com Find us on all podcast streaming platforms here!

The Herd with Colin Cowherd
Moneyline Monaco - NFL Bets: Fields & Bears new kings, Packers plummet, Lions & Vikings overrated?

The Herd with Colin Cowherd

Play Episode Listen Later Aug 12, 2023 39:11 Transcription Available


Alex Monaco discusses his best bets for the NFC North for the 2023 NFL season. Are Justin Fields and the Chicago Bears primed for a breakout year? Will Kirk Cousins, Justin Jefferson, and the Minnesota Vikings be able to repeat last season's magic? Will Jared Goff and the Detroit Lions live up to the hype after finishing 8-2 over the final 10 games last year? And what should we expect from Jordan Love and the Green Bay Packers now that Aaron Rodgers is in New York with the Jets? #Volume #Herd See omnystudio.com/listener for privacy information.


The Corona Diaries
Chapter 167. Tom is striding out in his leather pants for a loaf

The Corona Diaries

Play Episode Listen Later Jul 17, 2023 59:41


In our distinctive haphazard style we finally got back onto a bit of FEAR chat for #167. After quite a deep dive into The New Kings & El Dorado a few episodes back, today was dedicated to Living in Fear. As you will hear, I am still not quite sure if the lyric is fanciful, full of protest, or just a bit naive, but what I will say is there is a lot to unpack. And, if you are purple, and you live in close proximity to the coastal resorts of the UK, you might want to get ready - your moment may well be nigh. Right, I'm off to see if I can acquire an inverter and a deckchair from Mick & Keef in the village shop. Tiddely-om-pom-pom, h. P.S. Apols for the fact it is a little late this week, Ant rushed off on holiday before pressing the schedule button. What is the collective for a pair of cabbages? TCD Merch Store | Become Purple and support the show | The Invisible Man Volume 1: 1991-1997 | The Invisible Man Volume 2: 1998-2014 | Facebook | Instagram | Website

The Blackest Questions with Dr. Christina Greer
Anthony Anderson & Cedric The Entertainer Are The New Kings Of BBQ

The Blackest Questions with Dr. Christina Greer

Play Episode Listen Later Jul 4, 2023 20:56


They can act, and they can make us laugh, but can they ace The Blackest Questions history exam? Anthony Anderson and Cedric The Entertainer join Dr. Christina Greer to test their Black history knowledge and dish about their new A&E series Kings of BBQ. The pair have been friends for years and have now teamed up to share their friendship, and love of food, including a line of rubs and sauces with more items on the way as they grow their barbecue empire.See omnystudio.com/listener for privacy information.

The Give N Go
Mexico's Downfall & The New Kings of CONCACAF - USA!

The Give N Go

Play Episode Listen Later Jun 21, 2023 57:59


Reynoso & Soltero dive deep into all things USA and Mexico. Debating whether this is the darkest period in Mexico's footballing history while praising the USA for their development over the past 8 years. Who should coach Mexico next? How far will the USA go at the 2026 World Cup? There's lots to dissect, so we hope you enjoy the episode! (00:00) It's hot af in Texas (01:31) What the hell happened to Mexico? (16:46) USA are the Kings of CONCACAF! (26:00) What's the USA's next challenge? (30:11) Is Berhalter the correct choice? (32:55) Why Gio Reyna is the key for the USA (33:53) Who should coach Mexico next? (45:44) What's at stake for Mexico? (50:34) Should Mexicans be worried? (52:21) What if USA disappoints at the World Cup? (54:20) The importance of Copa America 2024 (55:53) How far will USA go at the 2026 World Cup?

PRI: Arts and Entertainment
New Kings and Queens soccer leagues enlist internet stars to revamp sport

PRI: Arts and Entertainment

Play Episode Listen Later May 2, 2023


The game is loosely based on soccer, but immersed in video game culture and reality TV antics. In Barcelona, Spain, the second season of the Kings League kicks off the first weekend of May alongside the first season of the Queens League. This summer, the Prince Cup will launch for kids ages 9 to 11.

fred and walk in the house music
FOUR SEASONS OF SOULFUL VOL.13

fred and walk in the house music

Play Episode Listen Later Apr 22, 2023 56:05


Bob Sinclar , Quinze - never knew love like this before (NEW) Kings of Tomorrow , Lorenzo Marcillas , Amber Liekhus - it's not over (NEW) Ron Carroll , SwayLo - something beautiful - Michael Gray remix (NEW) Beat Rivals , Hannah Khemoh - The Fire - Boon remix (NEW) Groove Junkies , B. Valentine - Lovin' you - Groove junkies & Deep Soul Syndicate Main mix (NEW) Marcelo Vak - I'll be Good - ThomChris remix (NEW) Sonic Soul Orchestra , Kathy Brown - touch me - Block & Crown Extd club mix (NEW) Shawn Christopher - another sleepless night - Wayne Soul Avengerz & Odyssey Inc. remix (NEW) DEUS - reaching for the Sky - Classic mix

Chop Soccer
New Kings League team in Brazil to be led by Neymar

Chop Soccer

Play Episode Listen Later Apr 4, 2023 8:05


The Kings League Final Four matchup, the largest in-person event ever hosted by a Twitch streamer, culminated with multiple shocking revelations that stunned fans and compelled them to keep watching the show. The biggest shock came from something Piqué had been hinting at for days: Neymar, a former teammate of his who is now playing for Paris Saint-Germain F.C., was joining the Kings League. Rox and Ken talk about how the Kings League is changing the landscape and whether it would ever flourish in the USA.

Locked On Kings - Daily Podcast On The Los Angeles Kings
The Kings goalie situation looks solid. Who should be the #1?

Locked On Kings - Daily Podcast On The Los Angeles Kings

Play Episode Listen Later Mar 6, 2023 28:09


The Kings have won 3 in a row as they get set to host the Washington Capitals. New Kings goalie Joonas Korpisalo won his LA debut over the weekend. I'll tell you how he looked and what the goalie rotation should be the rest of the season with him and Pheonix Copley. Former Kings goalie Jonathan Quick made his Vegas debut, I watched it. I tell you what I saw. All that and more on this edition of Locked on LA Kings. Support Us By Supporting Our Sponsors! Built Bar: Built Bar is a protein bar that tastes like a candy bar. Go to builtbar.com and use promo code “LOCKEDON15,” and you'll get 15% off your next order. Athletic Greens: To make it easy, Athletic Greens is going to give you a FREE 1 year supply of immune-supporting Vitamin D AND 5 FREE travel packs with your first purchase. All you have to do is visit athleticgreens.com/NHLNETWORK Learn more about your ad choices. Visit podcastchoices.com/adchoices


From the Backseat: A Sports Podcast
SD Padres are the New Kings of the MLB West and a Draft of Super Bowl Halftime Shows

From the Backseat: A Sports Podcast

Play Episode Listen Later Mar 1, 2023 65:51


We are declaring ourselves a Padres Podcast and the Dodger Tears can come get us. The Padres have signed Manny Machado to a new contract, cementing them as an organization who will pay until they win. San Diego has officially overtaken the Los Angeles Media Market. Check out our Merch Store: https://www.teepublic.com/user/from-t... Thank you Seatgeek for 20 dollars off your first order use the code: fromthebackseat Use this link to find all our socials: https://withkoji.com/@fromthebackseat Instagram - https://www.instagram.com/backseat_pod/ Tiktok - https://www.tiktok.com/@fromthebackseat Twitter - https://twitter.com/backseat_pod Chalkboard - https://links.chalkboard.io/join-boar... #MLB #Padres #Dodgers Thank you for listening to From the Backseat! Make sure to rate and leave a comment on the Podcast if you liked what you heard. We are shouting out anyone who leaves a review, plus I will link you in the description.

Transforming Cities
Andrew Katz of Katz Development

Transforming Cities

Play Episode Listen Later Feb 15, 2023 57:37


On this episode I'm speaking with Andrew Katz, Founder of Katz Development. Since 2015, Andrew has participated in the development of approximately $350M of multifamily, office, retail, and hospitality projects totaling roughly 1M square feet in Colorado and Ohio. Since moving to Denver, Andrew has focused on infill development in and around the urban core with local developer, Westfield Company. Notable projects include: The Mission Ballroom, a 4,000-capacity new-build music venue operated by AEG; North Wynkoop, a 14-acre mixed-use master development comprised of new-build office, multifamily, and hospitality, anchored by the Mission Ballroom; Sustainability Park, a 100-unit condo and retail project anchored by renowned Japanese restaurant Uchi; and the 150,000 square foot adaptive re-use Stanley Marketplace. Andrew grew up in Cincinnati, OH where he graduated from The Ohio State University with a bachelor's degree in Real Estate. Shortly after, he moved to Denver, Colorado to pursue a career in real estate development. He founded Katz Development alongside his father, Scott Katz, who has owned and managed Cincinnati-based development and brokerage firm Midland Retail for 30 years, and his younger brother, Adam Katz. Related links for this episode: Katz Development - https://www.katz-dev.com/ Andrew on LinkedIn - https://www.linkedin.com/in/andrew-katz-467852ba/ Andrew on Twitter - https://twitter.com/katzdevelops Hotel Magdalena - https://hotelmagdalena.com/ Platte Fifteen - http://www.plattefifteen.com/ Ascent MKE - https://www.ascentmke.com/ T3 RiNo - https://www.t3rino.com/ PDX Airport - https://pdxnext.com/Stories/Details/new-pdx-roof-from-forest-to-frame The New Kings of New York (book) - https://amzn.to/3jyhAy4 Be sure to support this podcast by subscribing and reviewing! Visit Authentic Form & Function for more information: https://authenticff.com © 2023 Authentic Form & Function

Locked On Knicks - Daily Podcast On The New York Knicks
The New Kings Of New York: Jalen Brunson And Josh Hart Help The Knicks End The Nets Reign Of Terror

Locked On Knicks - Daily Podcast On The New York Knicks

Play Episode Listen Later Feb 14, 2023 32:41


The Nova Boys can hoop a little, huh? Jalen Brunson and Josh Hart combine for a tidy 67 points as the New York Knicks blow out the Brooklyn Nets and put a nine game losing streak against their crosstown rivals to bed. Gavin Schall is joined by his former Locked On Nets co-host Josh Bass to discuss what the Knicks ceiling is if Brunson and Hart keep playing this well, assess RJ Barrett's struggles and explore what the future of the Nets looks like now. Follow & Subscribe on all Podcast platforms…


Working Capital The Real Estate Podcast
Cities, Skyscrapers and Development with William Strange | EP135

Working Capital The Real Estate Podcast

Play Episode Listen Later Jan 17, 2023 39:57


William Strange is a Professor of Economic Analysis and Policy at the Rotman School. William is former Editor of the Journal of Urban Economics (with Stuart Rosenthal), and he served in 2011 as President of the American Real Estate and Urban Economics Association. He works in the areas of urban economics and real estate. His research is focused on agglomeration, industry clusters, labor market pooling, skills, private government, real estate development and real estate investment. In this episode we talked about: William's background and how he got into real estate; the Rotman School real estate program; his paper analyzing skyscrapers; the macroeconomic outlook; urban economics; and resources. Useful links: Book “Triumph of the City: How Our Greatest Invention Makes Us Richer, Smarter, Greener, Healthier, and Happier” by Edward Glaeser Book “The New Geography Of Jobs” by Enrico Moretti https://www.rotman.utoronto.ca/FacultyAndResearch/Faculty/FacultyBios/Strange.aspx Transcription: Jesse (0s): Welcome to the Working Capital Real Estate Podcast. My name's Jesse Fragale, and on this show we discuss all things real estate with investors and experts in a variety of industries that impact real estate. Whether you're looking at your first investment or raising your first fund, join me and let's build that portfolio one square foot at a time. Ladies and gentlemen, my name's Jesse Fragale, and you're listening to Working Capital, the Real Estate Podcast. My guest today is William Strange. Will is a professor of economic analysis and policy at the Rotman School, that's at the University of Toronto.   He's the former editor of the Journal of Urban Economics, and he served in 2011 as president of the American Real Estate and Urban Economics Association. He works in the area of urban economics and real estate. His research has focused on industry clusters, labor market pooling, skills, private government, real estate development, and real estate investment. Will, thanks for being here.
How's it going?   William (58s): Thanks a lot for having me, Jesse. It's going great.   Jesse (1m 1s): Well, I appreciate you coming on. Like we said before the show, there's a couple different areas of research that I thought we could jump into that the listeners would get a lot out of. But before we do that, why don't we circle back to you in your current role at the University of Toronto and what you're working on today. How did that all come to fruition? How did you get into this business of real estate?   William (1m 25s): Well, I got into real estate as an urban economist. When I went to graduate school, my favorite undergraduate econ class was urban. I liked it because there are so many things going on in cities. Cities are just interesting organisms. And so I pursued a PhD at Princeton with Ed Mills, who is the father of the modern field of urban economics. That ended up with me at UBC amongst the real estate folks. And I gradually came to understand just how interesting real estate is too, and just how much an urban economist has to say about real estate, you know, both on the residential and commercial side.   I feel incredibly fortunate that I've lucked into a career as satisfying as this one has been.   Jesse (2m 8s): That's great. And the current role that you have at Rotman, so for people that aren't familiar, that's the business school at the University of Toronto. The teaching that you do there, is it predominantly undergrad?   William (2m 21s): It's almost entirely MBA and PhD. I teach some vanilla economics, which I think is important too. But we also teach a bunch of real estate econ and real estate finance classes.
One thing that I would say to your audience is I'm also the director of the Center of Real Estate at Rotman, and we periodically put on public events. We put on one on downtown recovery back in December that was addressing the different pace at which downtowns were repopulating as Covid, fingers crossed, recedes.   And we were scheduled to do a housing market one with City Post in March, and we'll keep doing them as interesting policy issues emerge. We welcome people from outside Rotman. Please come, everybody.   Jesse (3m 12s): Yeah, that's great. And we want to jump into one of the papers that you did regarding covid. Before we do that though, I'm curious, you know, people in our industry, when we think of schools that have a real estate program at the MBA or higher level, whether it's economics or finance or real estate, I think of Rotman, I think of Osgoode. A lot of people have gone to Columbia in New York for their MSRED program. How long has that program, the real estate specific aspect of it, been at Rotman? Because I feel like you guys were one of the first to actually have that specialization.   William (3m 48s): It's nice of you to say, but it started building up when I came in 2001, and we've specifically positioned ourselves to not duplicate other programs. Like, I like the Schulich program very much, but there's no reason that we need to do something that's as specialized as their program is, given that they already have such a program, and that's a good program. So what we have done is to set up a smaller real estate program.
We have three electives of the 10 classes an MBA would take, with the idea being that people in real estate benefit from taking things outside of real estate, you know, that a good real estate person needs to know about finance, a good real estate person needs to know about strategy, and my various colleagues in Rotman can help in those ways very much.   Jesse (4m 33s): Yeah, no, that makes sense. So before we jumped on here, we talked about a paper that kind of piqued my interest, just being in the commercial real estate world, and it was basically a paper analysis of skyscrapers. I thought before we jump into this Covid paper, we could talk a little bit about this paper that you did regarding skyscrapers.   William (4m 53s): The skyscraper paper is still pretty relevant. I mean, what it's motivated by is that we're living in a new era of skyscrapers. If you look at something online like the skyscraper page, you can see the big buildings that people are planning to build. The Empire State Building was the biggest building in the world for on the order of 40 years before the World Trade Center. It has since been topped by the Burj Khalifa. And there are other buildings that are also really large that are either recent or that are being planned.   The big question is, are these big buildings being built big because it's economical to do so? Or are they being built big for some other reason? You know, possibly ego reasons, possibly other stuff. And so we have analyzed skyscrapers, this is in my paper with Bob Helsley from UBC. In this paper we look at skyscrapers as a contest for who is the biggest, this is assuming that people want to be bigger than the other person. Let me give you a couple of historical examples of that.   I mean, people did look at whether skyscrapers were economical in the 1930s after the big skyscraper wave of the twenties and thirties. That was allowed by things like structural steel and elevators.
And we see there a lot of stuff that looks game theoretical. So one story is the story of the Manhattan Company Building, which is now Trump's lower Manhattan building, and the incredibly beautiful art deco Chrysler Building.   And they were each built to be the biggest building in the world at the time. The Manhattan Company Building finishes first, so it has a ceiling on it, and they are very happy because the ceiling on the Chrysler Building is gonna be lower. So for some reason, the Chrysler Building did not build an extra hundred feet that would've made them bigger than the Manhattan Company Building. And this has an added issue of personal interest, that the lead people on both of those projects hated each other. They used to be partners. There was a breakup of their partnership, and not the owners of the buildings, but the architects despised each other.   Unbeknownst to the people who built the Manhattan Company Building, with the Chrysler Tower, the most famous thing about it, if the readers Google it right now, you'll see it, is the spire at the top. It was hidden inside the structure, so people didn't know what happened. And so they waited until the Manhattan Company Building had reached its ceiling, and then they raised, like a giant middle finger, the spire of the Chrysler Building, which made it an extra 50 feet taller than the Manhattan Company Building. It's really hard to argue that there is some economic, tenants-paying-rent sort of argument that would make you do something like that.   That's one example. Another example is the Empire State Building, which, I mean, we've all seen King Kong movies, so we know how the Empire State Building looks. But you may not know that the spire on top of the Empire State Building, which made it a couple hundred feet bigger than the Chrysler Building when it was built, was originally purported to be a Zeppelin loading dock.
So people would be taking international flights by blimp, and on top of Manhattan, where winds are pretty big, they would tie the Zeppelin on and then people would get off on it.   No one ever did that. That was just totally a fiction to allow the building to be as big as it could possibly be. So in this paper, we look at that as what is called in game theory an all-pay auction. That's an auction where you have to pay even if you don't win. In this case, you pay to build the building even if you don't win the race of having the very biggest building. Subsequent to our paper, which was theoretical, others have looked in various ways for empirical evidence in the data, and there seems to be a lot of it around. The moral of the story is that some of these big buildings look like they should be built based on economics, or at least you can make a justification of building such a big building on economic grounds.   But there's a lot of evidence that people wanna build a little bit bigger than the other guy, even if it's not economical, because of the prestige that seems to go with being the biggest building in a market or in the world or of a particular type. If you look online, you'll see all kinds of lists of, you know, biggest office building, biggest residential building, biggest building in Canada, biggest building in Toronto. It seems to be something that people do care about, and not simply just the economics of building real estate space for tenants to use.   Jesse (9m 29s): Yeah, that's a fascinating story. I'm almost embarrassed to say I had never heard of that. So with regard to the Manhattan Company and Chrysler buildings, they continued to build, hiding the spire within the   William (9m 41s): Envelope, within the structure, because with the steel structure, you know, you can hide it all inside. And then they literally raised it up.
I forget who wrote it, but there's a book on this whole episode, which I think is a fascinating story. Yeah.   Jesse (9m 51s): Oh, that's great. Yeah, it's interesting too, I'm reading a book right now, The New Kings of New York by The Real Deal, and it talks a lot about kind of the Trump era of New York, when there was basically a push to build higher and higher price per square foot, high-end condos. And it was really almost a race of who could build the best, the tallest. And it seemed to be a lot about ego rather than economics.   William (10m 16s): Yeah, I mean, I think ego matters in real estate. Look, I'm just a professor, I just write papers. Somebody who actually builds tall buildings can look at this thing that they've built, and I understand why people's personalities are invested in it and why they wanna build buildings that are deemed to be significant. I mean, for a long time the CN Tower was the biggest structure in the world, and people make a distinction between occupied buildings and unoccupied structures. And so, you know, clearly we in Toronto are not immune to building buildings for ego-based reasons.   Jesse (10m 51s): And was there a distinction in your research between commercial skyscrapers as opposed to residential towers? Or was it,   William (10m 59s): I mean, the early ones were all commercial. And, well, the Eiffel Tower shows people how structural steel lets you build stuff that's big, and then the Woolworth Building becomes the biggest building in the world, and then is supplanted, as I said a little while ago, briefly by the Manhattan Company Building, the, whatever the Trump building is in lower Manhattan, and Chrysler. They were commercial. But now we see people building big residential buildings. I mean, it can be problematic.
The former Sears Tower, and I'm having a brain cramp now about its current name, Willis Tower, I believe it was renamed a while ago. It had a problem after its initial construction because it was big enough that the building swayed in the wind, and this made people feel very uncomfortable. And so there was a period of time, and it could continue, I'm not sure whether it does, that the highest suites in that building were used for storage, because people didn't wanna be up there, because it wiggled around too much. Yeah. And it just made them uncomfortable. For residential, I mean, I don't know what your experience is, but I have a friend who was on the 40th floor of a Toronto building, which, you know, he thought was beautiful, gave him a view of the lake and so on and so forth. But during covid, when you don't wanna be in the elevator with a lot of people, or worse still, if the elevator is slow or is not running, you know, 40 stories is a long ways to walk.   Jesse (12m 24s): Yeah, absolutely. Well, with the Willis Tower, that'd be Chicago too, so I'm sure it'd get pretty windy up there. I think for us, if I'm not mistaken, today it's First Canadian Place, at least in the Toronto area.   William (12m 38s): Yeah. Ever since it's been built, that's been the biggest building in Canada, and it's, of course, commercial. There are some things that I believe people are considering that might be bigger but haven't been built yet.   Jesse (12m 48s): So you mentioned a question that you ask your class at Rotman, one that, right before we got on this call, I would've failed, and you can pose the question to listeners.   William (13m 2s): Well, I mean, I've said that this is an era of skyscraper construction, and I've talked about the earlier one. And the question is, what is it that it took for us to have skyscrapers?
And it turns out there are two things that it took. It took structural steel and it took elevators. And before I ask the question, I can give you the elevator story, because that is also one that's worth hearing. Sure. Elevators are old. Archimedes figured out how you could use pulleys to lift things. The problem with a classical elevator is, if the cable was cut, the elevator would fall, and whatever was on it, including humans, would be destroyed.   And thus elevators were not used, you know, for large distances for human beings, because it was just considered to be too dangerous. The name that most people will associate with elevators is Otis. And Otis went to the New York World's Fair in, I believe, 1856, give or take two years, and he demonstrated his safety elevator. And the way he did it was, he was pulled up in the elevator, with a very sharp sword in his hand, to about 40 feet, with an audience watching him. And then he cut the cable, the rope that was holding the elevator, above himself, and the audience went, Ooh, because they were sure that he was now going to fall to his death.   But the Otis elevator's innovation was, it didn't fall. It was a safety elevator, and it had automatic brakes that would arrest it. Before that, you wouldn't see apartment buildings that were any bigger than six stories, because six stories is a lot to walk up. You wouldn't wanna walk up 10. But now, once you have elevators, vertical distance is not a barrier anymore, and that really changes the demand for big buildings. On the supply side, this is my question: what was the biggest building in the world in 1850, around when the elevator was developed and before skyscrapers started to be built?   So I'll leave you a minute to think about it. Look it up on Wikipedia or whatever. The answer is that the biggest building in the world was the great pyramid, from something like 1400 BC.
Why is that worth mentioning? Because it's a masonry building, and the key feature of masonry buildings is that the supporting walls on the lower floors have to get bigger and bigger as the building gets taller, in order to bear the weight, to say nothing of earthquakes and other problems with masonry buildings. Structural steel changes that. Structural steel lets you go up. I mean, it's incredibly robust. We don't always use structural steel now. The World Trade Center did not, to its peril. It used much lighter framing, and that was one of the things that meant that the intense heat the airplanes produced when they hit the building was able to bring it down. That's a worthwhile story to point out, because the Empire State Building was also hit by an airplane, during World War II, which people might not know about because the Empire State Building is still there. Yeah. It was foggy, and a World War II bomber crashed into it, but because it was structural steel, it basically bounced off.   I mean, it was not good for the airplane and not good for the pilots, but it survived. But we've learned cheaper ways to build buildings subsequent to that, without structural steel. And that seems to be one of the factors that's responsible for the skyscraper wave that we have seen in recent years, with the Burj Khalifa now the tallest building in the world. For a while, Taipei 101 was the biggest building in the world. You have very tall buildings being built in many Chinese cities, especially Shanghai.   People are building big buildings, you know, and part of it is the strategic thing that we talked about a minute ago. In the case of Taiwan, I mean, if you read about that building, it's clear that this was a matter of great national pride, and so they were building it to make Taipei obvious as an important business city and to make Taiwan an important place.
The same sort of thing in places like Dubai with the Burj Khalifa. I mean, what will be the financial center in the Middle East? It's not obvious what it will be. Having big buildings, you know, they're hoping that if they build it, people will come.   Jesse (17m 10s): Hmm. Yeah. That's fascinating. Well, it was good to jump on that, cuz when I saw that paper's title I was like, well, it's got economics, it's got skyscrapers. So just being from the commercial real estate side of things, I thought it'd be something listeners get some value out of.   William (17m 24s): Well, I mean, so for your readers who are in the industry, it's a valid question for folks to ask. Do the economics justify such big buildings? I mean, in a lot of cases they do. People were convinced that, say, the Empire State Building did. Of course, the Great Depression began after the Empire State Building was started and before it was finished. And so the Empire State Building was financially rather a disaster. It was called the Empty State Building for about the first 10 years, because they had so much trouble tenanting it up.   And so this is something that market participants should ask themselves. Does the market support a big building, or is there something else that's going on with the building's size?   Jesse (18m 2s): Yeah, well, we're certainly going through a, you know, a different version of that in terms of some of the construction, or over construction, in some of our major cities, and just trying to see if the lease ups will actually, if the absorption will be able to fill those buildings.   William (18m 18s): Right. I mean, we had buildings that were designed pre covid and that came on the market in 2022, and they are partly responsible for the slow absorption that we've seen in recent years. I mean, that's a very valid point. I mean, a lot of my other research has dealt with the fundamentals of why people want to concentrate spatially. Hmm.
So, I mean, in Canada, a huge amount of our population is in the three cities of Vancouver, Montreal and Toronto. Yeah. In the case of the US, when people use satellite data to look at how much of the country is actually occupied, so you're looking at data that reflects down on the land, and the satellite can tell you, is this dirt or is this concrete, the US is a big country, and 2% of it is developed. I suspect the number would be even smaller in Canada, but I haven't seen somebody use satellites to do that. So we have this situation where Toronto and Vancouver at least are incredibly expensive, where households say that affordability is the biggest issue that they face economically, not just in real estate, it's the biggest issue that they face.   And yet everybody keeps piling into Toronto no matter how expensive it is, and thus prices continue to go up and up. I mean, I think one of the silver linings we may see from Covid is that through Covid we have learned that remote work is possible. You can't do everything remotely that you can do in person, but you can do a lot. And to the extent that Covid allows people to do things remotely, you know, either at different places in the same city or even in different cities completely, that may make it less essential for everybody to be down at Bay and Adelaide, you know, paying the high rents that people pay down there, and thus paying the high housing prices that you have to pay to be close to Bay and Adelaide for your job as an investment banker. You know, this is a possibility to unlock value for folks by freeing them from the Toronto housing price death spiral that people have been dealing with for so many years.   Jesse (20m 19s): Yeah. And we're dealing with, so we have 84 offices, predominantly in North America, but we are a global company. And it's one thing where you are taking a B-class or a suburban office and converting it to industrial or residential.   It's another thing to have these massive towers in cities, and just trying to figure out how we repurpose the space, whether, you know, and   William (20m 39s): People are sure talking about that, and there's certainly fortunes to be made by people who figure out how to do it right. But I mean, what I'm hearing, and I'm nobody's architect, but what I'm hearing is that the challenge is the seven and a half foot ceilings that you might see in an office are really problematic in a residential setting. And you can make a lot of internal changes in the building, but dealing with the floors is hard.   Jesse (21m 1s): Yeah, absolutely. And I think some of what you just mentioned here touches on, I noticed another paper on your link on U of T's or Rotman's website was entrepreneurship in cities. And I imagine that kind of ties into what you're talking about here, it's that question of why do we congregate in these   William (21m 18s):
He wrote this in response to a friend of his, a fellow who owned a comedy club arguing that New York City was dead. And in this case, I'm happy to say that I agree with Jerry that that places like New York and Toronto are for sure challenged by, by things that happen associated with C O V D.   You know, two years ago what we were worried about is making each other sick. We are less worried about that as the disease has become less virulent as we and as we become vaccinated. But you know, hopefully, you know, COVID is killing 500 Americans a day. I don't know how many Canadians it's killed killing a day. Are we are much healthier than America is in that particular regard. But in, in addition to that being a challenge for folks, the working from home phenomenon is almost certainly here to stay.   It's just incredibly valuable for people to stay home and write reports for a day instead of fighting traffic to drive 45 that's from North York downtown, and then do the same thing again in the afternoon. So anyway, Jerry's friend wrote an article saying New York was dead. You know, that that that the value of being close to other people was, was really being challenged. Seinfeld said, no, it wasn't. We did some work using contemporaneous data. So the only time in my life I've used absolutely fresh data off the process and I I now have more patience with other professionals who use that, who use that kind of data.   It's just a lot harder to do stuff with that. And we looked at something called the commercial rent gradient. So the commercial rent gradient is telling you how much rents are declining as you, you're moving away from, from the city center. And so, so in Toronto, rents are highest in the city center. They go down as they move away, they rise in suburban sub-centers. 
We were not able to get good Toronto data to do these calculations here, but we did do it in cities that are like Toronto in the us like New York and Toronto and in and in cities like that, the gradient might be 6%.   So my, my co-authors were American, so they made me do this with miles, but the result was rents are declining by roughly 6% a mile as you move away from the center of activity in the city. If, if the big cities are dead, you know, given the long term nature of commercial leases, we should see people demanding large discounts when they're signing up in the downtown or, or close to the downtown, not paying the premiums they previously paid with the onset of covid and work from home and stuff like that.   What we found was a little of that, but not a lot of it. What we found was that the gradient went down by about a sixth. It went down from about 6% to about 5%, but it's still a gradient. People are still signing leases in 2021 to pay a big premium to be downtown, which is suggesting that, you know, as mu as much fun as Zoom can be and as productive as Zoom can be, it's not the same thing as sitting next to the other person and, and hearing them talk with their clients and realizing there's some synergy with what you're after and what they're after, which is the kind of thing that people are paying big dollars to locate downtown and getting.   So our answer is so far the downtown is less attractive, but is still attractive in, in core dominated cities like Toronto. Now can I tell you that it's gonna be that way five years from now? Of course I can't And and we do promise I'm saying this to someone who will broadcast it. So I guess this promise has some credibility. We promise that once, I mean our intention was once Covid is behind us, do this again. We are realizing that Covid will not be behind us and we'll have to pick another time to do it again and see what the evolution of this is.   
But thus far we're still seeing people attracted to large cities. One scenario would be that this is a continuation of a phenomenon that Toronto saw in the late eighties and the nineties, when back-office work got moved out of Toronto to Mississauga and then later to places that are farther away than Mississauga. People thought, oh no, the downtown's going away. No. What we were doing was keeping downtown only the people who really need to be there, the people who need to interact with other folks. That's what really matters, not the fact that the physical files are located in the building.   So this may be the same kind of thing, where downtown Toronto just becomes more and more rarefied. The investment bankers stay there, but maybe not the middle managers. Now, that is a social issue that we have to engage with, if Toronto just becomes a city of investment bankers and Uber drivers, which is sort of the story that I'm telling you. But at least that evidence and that theory point us in the direction of that being someplace we could end up.   Jesse (27m 4s): Yeah, no, for sure. And on the anecdotal side of things, what we see on the street is leases being signed. We see that there is a bit of a spread between the bid and the ask, but not at a big discount. I have clients who call me, and especially at the beginning and in the middle of COVID they were expecting these 20%, 30% discounts on pricing and leasing, and they just weren't happening. Landlords were providing inducements, whether it was free rent or allowances. 
But even today, we still see these leases being signed, and if anything, the trend that I've seen with most of the clients in the downtown areas, whether it's New York, Boston, or Toronto, is that the term flight to quality gets thrown around a lot.   We're seeing a lot more of that. And, I agree completely, even four years ago a startup might want to be in a trendy area on the periphery of Toronto or of New York, but now we're starting to see more of them have transit as a component. Not that it wasn't important before, but we're seeing it pretty much at the top of the list for these tenants.   William (28m 5s): Yep. Transit matters, and the businesses are deciding they wanna be where the accountants and the business lawyers and the bankers are, because they need to interact with them all the time. On the flight to quality, I've heard noises in that direction also. Look, people have been talking about the retail apocalypse for years, about online shopping cannibalizing brick-and-mortar retailing. Now, did that kill the Eaton Centre? It didn't, because the Eaton Centre is in a market position where people are still willing to go there. But it's gonna kill someone.   Jesse (28m 37s): I've gotta go there today.   William (28m 39s): Good for you. I'm glad one of my predictions ends up being true. But old-fashioned malls, they're getting torn down and replaced with something different. And I think we could imagine that being something that would happen too. 
I mean, something the audience should think about more generally is that the way the downtown has been for the last 10 years is different than it was 30 years ago, when you had back offices there, and it's way different than it was a hundred years ago, when there was still a lot of manufacturing activity in the downtown, taking advantage of the proximity to the lake and to shipping and things like that.   So the notion that the downtown should be frozen in amber as of 2000 or so is crazy. It's never been that way. It's gonna change as business changes, and that's a good thing. The ability of Toronto to deliver good jobs and high-value business outcomes is crucial for all of Canada, and anything we can do to make Toronto a better competitor to New York, Boston, and San Francisco very much serves Canada's interests.   Jesse (29m 42s): Absolutely. So I wanna be mindful of the time here, Will, but I do wanna get to your paper, and I'm not sure if it's your most recent paper, the one on COVID, but maybe you could give us the   William (29m 54s): The COVID one was the one I just talked about a second   Jesse (29m 56s): Ago. Okay. So what was the ultimate thesis of that? Was it this divide that we're seeing, I would say even kind of an inequality, as a potential outcome of having downtown cores be predominantly bankers? Or was that the other paper?   William (30m 13s): The focus was on whether downtown would still be as important as it used to be. And I left out some of the results: in addition to looking at core-dominated cities like Toronto, we also looked at much more spread-out, car-oriented cities like LA and Dallas. And the pattern in those places was different. In those cases, the gradient was already smaller. It was, you know, 2% rather than the 6%. 
And it didn't change a lot after COVID, because in LA the downtown is different from the rest of the city.   But LA is not a downtown-dominated city the way that Toronto is at all, and COVID didn't affect those cities the same way. We also looked at some parallel results that weren't as parametric, if you'll forgive my geekiness. The gradient puts an exponential functional form on rents to get a percentage decline from the downtown. But how are we to think about sub-centers in North York and Mississauga and Markham and places like that, relative to having one downtown at Bay and Adelaide?   So we also lookeded at the premium that tenants pay to be in a high-density environment. That's a more flexible functional form. We basically got the same results: the value of density does get smaller, just like the gradient gets smaller, but it by no means goes all the way to zero. Cities aren't dead yet. Now, the changes are just starting and things may change a lot. We may finally end up in a circumstance where distance really is dead, the way people have been saying it would be since the early nineties.   But we're certainly not seeing it yet. And looking at real estate markets is one way to understand that, because people can talk about distance being dead, but that's just talk. A tenant putting down a guarantee on a real estate lease is putting their money where their mouth is, in how much they're willing to pay for the downtown versus someplace farther out, or for a dense non-downtown location like the centre of Mississauga relative to somewhere more peripheral.   What we're seeing is that people are still willing to pay premiums for those things. This could change, but it did not change in the early years of COVID. And you're telling me that your sources say it's not changing right now either. So I think that's where we are as of this minute. 
Will it change? Who knows?   Jesse (32m 39s): Yeah, it's kind of a fascinating time in the sense that it's hard to get data points when we're, fingers crossed, coming out of COVID but potentially entering a recessionary environment. We're positive on one front but drawn back on another. And I'd be remiss, speaking to an economist, if I didn't ask a little bit about the macroeconomic environment.   William (33m 2s): I'm not a macroeconomist, so I'll probably dodge, but by all means you can ask.   Jesse (33m 6s): Yeah, I mean, how do you see this, if you do at all, as a comparison to '08 or '01 or the early nineties? We've come out of something extraordinary, the pandemic, but now we're seeing inflation numbers that we haven't seen in years.   William (33m 26s): I think it is absolutely something to be worried about, because inflation, as economists who know more about this stuff than I do have always said, reduces the information content of prices and reduces the incentives that price systems provide. So it just makes capitalism work less well than it would have previously. So it's certainly a risk. I will say that the government's decision to stimulate the economy during COVID kept us from having a recession. I don't know if you recall, but in May of 2020 the CMHC, who know a lot more about housing than I do, put out a projection predicting that housing prices would fall; I think the number was 18% in the preferred model that they offered. Now, I didn't have a model, but that was my inclination also, and the inclination of my colleagues: housing is a normal good, people buy more of it when they're rich, and closing people out of their workplaces seemed surely recessionary. 
So I told my neighbor, who I like and respect, you know, if you're thinking about selling your house, the next few years are problematic. I was wrong.   Prices went up by more than 30% in Toronto, quality-adjusted, during that period, in part because the government tried to keep people from being killed. But now they've spent huge amounts of money, and they can't spend like that forever. And economies don't stay in a boom forever either. So there is uncertainty, and there is risk.   Jesse (35m 0s): Yeah. Well, I guess nobody has a crystal ball here for this next year.   William (35m 4s): Especially not microeconomists and people who spent a lot of their careers doing theory.   Jesse (35m 9s): No, I wouldn't sell yourself short. I feel like a lot of the insights come from the micro and get extrapolated.   William (35m 16s): Well, like any economist, I believe in the division of labor, and there are other people who know more about macro than I do.   Jesse (35m 23s): Yeah. So Will, we're gonna wrap up here. First of all, for those that want to learn more on urban economics, urban planning seems to be a passion of yours, but just generally speaking, are there books or resources that you've used in the past that you think would be good recommendations for listeners if this is something they're interested   William (35m 43s): In? Yeah, there are a couple of them. And I'm giving you civilian-friendly books, okay, that you could read to pass the time on an airplane, not a boring textbook. The two examples that come to mind immediately are a book called Triumph of the City by a guy at Harvard called Ed Glaeser, and another book called The New Geography of Jobs by a guy at Berkeley called Enrico Moretti. 
They are both lucid explanations of the kinds of forces we've been talking about. Now, both of them are a little less about real estate than our discussion has been, but they are about the forces that feed into real estate markets.   Someone who's a market participant has to be asking themselves: why are people paying the premiums for the downtown? Will they continue to pay the premiums for the downtown? And if not, how can I trade on that perception? Because there are clearly gonna be places where people who get priced out of Toronto go, and those real estate markets are going to boom. I don't think people are gonna go to Vancouver to be cheap, although maybe they will go to Vancouver for warmer winter weather.   A question that I think is unsettled as of this moment is: do people who get priced out of Toronto go someplace close to Toronto like Hamilton, so you can drive in for a Wednesday meeting but it's cheaper than Toronto? Or do you go to someplace like Montreal that is farther but is cheap for a big city? Or do you think about somewhere that's farther still and cheaper still, like Halifax? I mean, the Maritimes are a wonderful place and a whole lot cheaper than Toronto.   And if a huge amount of your work is Zoom meetings, for some people that location is gonna be the more economical place to   Jesse (37m 25s): Be. Yeah, that's interesting. So I've read Ed Glaeser's book, but I have not read The New Geography of Jobs, so that's definitely going on the reading list. For those just interested in your research, Will, or the Rotman program in general, where's the best place to send them? And we'll put a link in the show notes.   William (37m 46s): I mean, look, people can email me, and I will either respond or not, depending on how many thousands of emails I get. 
For admission to the programs, we are recruiting students every year. I think our MBA program is fantastic. We have programs that work at the full-time level and get done faster, but we also have part-time programs that work better for professionals. And I actually think the case for the part-time programs has become stronger in recent years, because there are gonna be a lot more times when somebody can meet a professor in office hours on Zoom rather than having to schlep up to the Rotman school after work.   But we also have these public events. I don't know what the link would be, but Googling Rotman events is gonna put you in touch with real estate things, and a lot of other things that would be useful. We try to be good citizens. We're physically close to the center of business in Canada; it's what, five subway stops or so to get up here. We want people in the building, and now that the building is open, I think people would find it a good use of their time to show up for some of the things that happen here.   I would also give a shout-out to the new School of Cities that was formed separately from us at the University of Toronto. It attempts to include the stuff from my world on econ and real estate, but also architecture and planning and things like that that also relate to cities. It is the first of its kind in the world, has a fantastic director, and I think will do very cool things in time.   Jesse (39m 21s): My guest today has been Will Strange. Will, thanks for being part of Working Capital.   William (39m 25s): Thank you very much.   Jesse (39m 36s): Thank you so much for listening to Working Capital, the Real Estate podcast. I'm your host, Jesse Fragale. If you like the episode, head on over to iTunes and leave us a five-star review and share on social media. It really helps us out. If you have any questions, feel free to reach out to me on Instagram. 
Jesse Fragale, F R A G A L E. Have a good one. Take care.

Getcha Popcorn Ready with T.O. and Hatch
EP 114 - Guy Torry Gives His Top Picks for New Kings of Comedy


Dec 22, 2022 • 52:00


T.O. and Hatch welcome Comedian and Actor, Guy Torry, to the show and discuss being in a funny family with his brother, Joe Torry, who the new Kings of Comedy should be, and how growing up in St. Louis he still became a Lakers fan and a Clippers hater. Stream for free on Hisense Smart TVs, LG Channels, Sports on Tubi, Plex, Samsung TV Plus, The Roku Channel, Vizio Channels fubosportsnetwork.com, and XUMO or as part of fuboTV's subscription packages of 100+ sports, news and entertainment channels. Learn more about your ad choices. Visit podcastchoices.com/adchoices

The Spoon
Ep 489: The New Kings Of Nederpop


Dec 16, 2022 • 75:06


This is The Spoon, where Dan Epstein is our guest, and we're doubled but funky!

Music By: Foghat, The Golden Earrings, Young-Holt Unlimited

Spoon Feeding: The Kitchn, The Walking Dead (again!), Rick And Morty, Inside Job, Dean Martin Celebrity Roast of Sinatra

The Men Of The Spoon: Robbie Rist, Chris Jackson, Thom Bowers

The Spoon on Twitter | The Spoon Facebook Group | The Spoon Facebook Page | Email: the_spoon_radio@yahoo.com

Sal Licata Podcast
The Yankees are the new kings of making excuses


Nov 15, 2022 • 5:18


Hal Steinbrenner continued the Yankees' run of making excuses for why they keep coming up short, and Sal says the Yankees are not even close to what they used to be.

Pickaxe and Roll
Are Zion and the Pelicans the new kings of the Southwest division?


Aug 10, 2022 • 40:01


Ryan Blackburn breaks down the movers and shakers of the Southwest division, covering the Dallas Mavericks, Houston Rockets, Memphis Grizzlies, New Orleans Pelicans, and San Antonio Spurs. Which team will reign supreme next year? Which matchups will give the Nuggets the most trouble? What are the best bets for end-of-season awards? Ryan discusses it all.

The Deuce & Mo Podcast
New Kings mock draft, Warriors send message and major changes in Utah


Jun 6, 2022 • 77:07


Deuce and Mo return to talk NBA Finals, a new Kings mock draft, Snyder out in Utah, Mitchell's future and shortening the NBA season. 00:57-Deuce playing hurt 3:38-Reaction to Game 2 of the NBA Finals 33:16-SI mock draft has Keegan to the Kings and thoughts on Jaden Ivey 55:59-Snyder resigns and Donovan Mitchell's future 1:06:50-Should the NBA shorten the season? See omnystudio.com/listener for privacy information.