Infants born by vaginal birth are exposed to maternal vaginal bacteria, one of the contributing influences on the subsequent development of the infant's microbiome. This process is altered by cesarean delivery, which changes the initial microbiome of the neonate. It is theorized that infants born by cesarean delivery have an increased risk of chronic inflammatory conditions due to altered early-life microbiome colonization, with associated aberrant immune and metabolic development. Vaginal seeding is the practice of inoculating an infant born by cesarean section with a sample of the mother's vaginal fluid, applied with a gauze over the child's face, mouth, and nares. This is performed to introduce the neonate to the mother's vaginal flora for presumed better health outcomes. Although cautionary statements have been published about this practice, it remains very popular. In Feb 2025, a "viewpoint" was published in JAMA Pediatrics which has brought vaginal seeding back into the limelight. Does this work? What are the official statements about this from ACOG and the AAP? Is there a way to do this "safely"? We will cover this new publication, review the official professional society statements…and more, in this episode.
This week was wild! JKJ set her 2025 goals, our kids have shows, the McRib is back, Moana 2 is just OK, and WE ALMOST DIED. Ok, so we weren't THAT close to dying, but it was kind of insane and smoke was involved. We answer your questions AND we have fun gift recommendations! It's been a wonderful season, thank you! Upcoming shows: Jan 10-11 Las Vegas, NV; Jan 18 Ft. Wayne, IN; Feb 14-15 Massillon, OH; Feb 28 Piqua, OH; March 1 Celina, OH; March 28-29 West Jordan, UT. For more JKJ and ticket links to all shows visit jennakimjones.com
It's our Christmas episode and in it we discuss all our favorite Christmas things and the ones we could live without. Turns out JKJ has some strong opinions about Christmas movies (surprise, surprise!) and #AL takes all of JKJ's very silly questions very seriously. Upcoming shows: Jan 10-11 Las Vegas, NV; Jan 18 Ft. Wayne, IN; Feb 14-15 Massillon, OH; Feb 28 Piqua, OH; March 1 Celina, OH; March 28-29 West Jordan, UT. For more JKJ and ticket links to all shows visit jennakimjones.com
In Feb 2024 the Guardian newspaper published an article entitled, "More than half of Tory members in poll say Islam a threat to British way of life". It cited two polls: an Opinium poll of 521 Conservative party members conducted 7-16 February 2024, and another poll of 25,000 people by 'Hope Not Hate' from December 2023 to January 2024. Results showed that 58% of Conservative party members and 30% of the UK public think that Islam is a threat to the British way of life. Are these views valid? What are British values and the British way of life? Do Islamic teaching, and Muslims living in Britain, pose a threat to this? We examine these questions in this week's episode of Pathway to Peace, highlighting how Islamic values and British values are aligned. Presenters: Arif Khan & Sufyan Farooqi
Henry Akins began training in Brazilian Jiu-Jitsu in 1995 at the Rickson Gracie Academy on Pico Blvd in West LA. Shortly after he started, he became the secretary at the academy and was spending 70 hours a week there watching and participating in all of the classes. In Feb of 2004, because of his persistence and dedication to the fundamentals and philosophies of Jiu-Jitsu, Rickson Gracie presented Henry with a black belt, making him only the third American at the time to receive that honor. Mickey Schuch is the owner and an instructor at Carry Trainer, an organization designed to provide a holistic approach to self-defense and firearms training. Mickey is a firearms instructor for Pistol, Rifle, Shotgun, Personal Protection Inside the Home and Personal Protection Outside the Home, a Range Safety Officer, and a member of the Illinois Tactical Officers Association. Carry Trainer provides real solutions for real problems. It is incumbent upon individuals, businesses, and organizations to protect and defend innocent life. Carry Trainer provides training for individuals who are looking to make themselves the best that they can be. There is much more to self-defense than obtaining a permit to carry a gun. Training in any martial skill is more than attending a class; it is a lifestyle choice, and the decision to exercise the right of arming oneself for the defense of themselves and their family should not be taken lightly. Bullets cannot be called back once loosed from the gun. Safety is not an inherent trait; it is learned through proper training and repetition. Wolf 21 - Check out what they have to offer for the best sleep of your life: https://www.thewolf21.com Use Code: "clearedhot" for 30% off of your 1st order The Speed of War Comic Series: https://www.thespeedofwar.com/ Check out the newest Cleared Hot Gear here: https://shop.clearedhotpodcast.com/
In Colombia. In Feb. On a sun-splashed beach. Ice cold beer in hand. I found a BANGING podcast by DJ and producer @djwillclarke. Devoured it. Will's interviewed some absolutely WHOPPA guests and DJs: @carlcoxofficial, @moby, @patricktopping, @joshwink1, @waffdj, @scubaofficial, @dubfire, @dannyhowarddj. Will owns record label @allwehaveisnowofficial + released on Trick, Drumcode, Filth on Acid, Polydor. Hit 100M streams globally and appeared at the world's largest festivals. Will's also gotta food page called @willmakesyouhungry (yup…bingo…there's the food link). Globe galavanting takes Will to some of THE best restaurants. Our convo is jam-packed with these-are-a-few-of-my-favourite-things Rowntree Randoms. Celebrations Chocolate Box of all my Curiosities. Music, Podcasting, The Creative Process, Techno, Hustle, Making Things Happen, Branding, Business… oh! and, of course, Food…Glorious…Food. Even if you're not planning to be a SuperStar DJ (I still am: pending) (Dropped my filthy tech-house Dido "White Flag" mix at Village Hall last week - OOOOFFF - get ya chops round that) You'll still bloody LOVE this. ON THE MENU: 1. Why "The audience comes last, Feel the temperature of the room" - Rory Sutherland would be all over this 2. Why consistency of brand allows audience (and consumers) to feel comfortable 3. The Grit Toolkit to Keep Going: there are dark parts of every single day; proof hard work pays off if you don't change the goal posts 4. Sales Masterclass: how Will landed a contract with Dirty Bird Records: "Business is compromise, what's their goal, what's your goal" + "Make people think it's their idea" 5. Why Will's record label is called "All We Have Is Now" - the difference in Being vs. Becoming 6. Rick Rubin Creative Genius: the deeper the roots go, the more it blossoms out the other side 7. Creative Teams Need: 1. Absolutely no expectations 2. You need a lot of sh*t to find gold. Full episode live Monday 8am
Welcome to The Daily Wrap Up, a concise show dedicated to bringing you the most relevant independent news, as we see it, from the last 24 hours (5/10/24). As always, take the information discussed in the video below and research it for yourself, and come to your own conclusions. Anyone telling you what the truth is, or claiming they have the answer, is likely leading you astray, for one reason or another. Stay Vigilant. Video Source Links (In Chronological Order): Medical Coder Blows The Whistle On The COVID-19 Illusion Pfizer agrees to settle 10,000 lawsuits accusing pharma giant of hiding cancer risks of heartburn drug Zantac Taylor Hudak on X: "Do not fall for this new, scripted re-writing of history regarding C19 jabs In Feb 2021, several drs & scientists wrote to EMA warning about the risks of blood clots, strokes, autoimmune disease & more in an effort to stop the harm! The EMA did nothing! https://t.co/C3Vi0jbkxG https://t.co/PpxU7g9YL6" / X Tommy Mac on X: "I'm shocked, shocked I tell you! "Elon Musk's Neuralink chip malfunctions in first in-human brain implant" https://t.co/WRGmyqm0Vt via @nypost" / X Elon Musk's Neuralink chip suffers unexpected setback in first in-human brain implant Bethany Christian Services The Last American Vagabond on X: "In the last week TLAV has been blocked from streaming on @instagram & unceremoniously deleted from @tiktok_us with zero explanation. 
This while we are being told it is ALLOWING negative content about Israel.. I guess until you prove it with source material, as usual. Nothing new. https://t.co/DHvn6U0MyH" / X The Last American Vagabond on X: ""I have never seen a similar case of hundreds of bees attacking one person," Well they do say bees can smell fear
Niall Norton began his career training as an accountant and, once qualified, followed this up with a stint in Deloitte Corporate Finance before taking up the reins as CFO of the Irish operations of O2, where he was responsible for financial control functions, business process re-design, strategy planning and wholesale. In Feb 2004 he joined Openet as CFO, a company extremely proud of its Irish heritage with a unique ethos that is harnessed to drive success and become a global player in the telecoms industry. In September 2006 he became CEO of the company and under his stewardship it grew from a small Irish software company to a global international operation with revenues of circa $120m and more than 800 employees. He remained in this position until August 2020, when the company was bought by US-listed Amdocs, which provides software and services to the telecoms and media sectors globally. After the takeover, Niall was Division President and General Manager for Amdocs Networks. In this role, Niall had responsibility for rebooting the commercial strategy related to Amdocs' portfolio of software and services for the network market domain. Niall was recognized in 2017 as one of the 100 influential people in telecoms by Global Telecoms Business. Earlier this year Niall Norton established Clever Communications ("C-Comms"), a consulting company providing advisory services to client companies planning or undergoing scaling challenges. This is the 26th episode, with guest Niall Norton, Adviser and Mentor, in the Davy podcast series 'Everyday Business with Aidan Donnelly'. This podcast brings you insightful conversation between Aidan Donnelly and entrepreneurs and business owners/management with their own unique story to tell. If you like what you hear, please like, share and subscribe.
****Trigger Warning!**** I DO NOT RECOMMEND THIS EP FOR ANYONE UNDER THE AGE OF 18. In Feb 2021, a woman named Terri Cohee finds her worst nightmare in her 19-year-old son's bedroom: a bag filled with a human head! What happened the night before? What did her son do? Brian Cohee Jr. committed one of the worst murders. When you hear the very detailed confession from this man, it will leave you with nightmares. His morbid humor and obsession with serial killers and crime led him to dream about committing a murder he would be infamous for. I also talk about some superstitions and legends regarding solar eclipses, from the past and even still today. All this and more, so come join in if you dare!! Make sure to follow Creepy Chisme on Instagram and TikTok. You can also buy me a coffee at buymeacoffee.com/creepychisme and help support the show!
I broke my streak of 4 years! I have been doing this virtual fitness thing for 4 years, and last week was the first time I had to cancel a class for my clients. Let me tell you where it all started... in 2019 I was in the early stages of my business. I was offering nutrition coaching to clients who were looking for support for weight management. I was always interested in fitness, but didn't have the credentials to support my clients in the way I wanted to, so I pursued my Personal Trainer Certificate. I took a course, went to training in Boston for a weekend & studied my tail off all summer. Early in Sept. 2019 I took & passed my certification exam & was officially a CPT. My intention was never to be a group fitness instructor, and I really wasn't even interested in providing 1:1 personal training. Initially I started offering written fitness plans for my clients to follow. I started to do some 1:1 training at a local gym that had just opened up & that sparked some interest in a group fitness class. In Feb 2020 I was just getting started doing a couple of fitness classes a week & found I REALLY enjoyed this. I was planning for a virtual fitness program BEFORE the world stopped in March of 2020, but I know for certain the events surrounding the pandemic helped to propel my business. Fast forward 4 years: I continue to offer this virtual fitness program & it has become the majority of the business I do in Inspire. I absolutely LOVE it! The Inspire Fitness program has grown to over 550 members. My clients join me from all over the country & internationally. Women work out with me from the comfort of their own homes. It is a convenient, efficient & effective way for them to get fit. For the past 4 years I've offered 2-4 LIVE classes each week & my library has grown as each class is recorded. Each week I write a schedule & each day I show up for my clients & do the workout. It wasn't my intent to have this "streak," I just kept showing up consistently & doing what I said I would do. 
It is important to me to be dependable to my clients & be there for them when I say I will be. Thankfully, I've avoided major illness or injuries and for that I am incredibly grateful! This past week I had a family situation arise. I was traveling for a work conference & planning to come home Wednesday evening. I had a workout scheduled early Thursday morning. Without sharing too much personal information, I made the decision to make my family a priority & I extended my trip to see my Aunt, who is very ill. It really wasn't a difficult decision for me, but I had to let my clients know it was going to impact the weekly schedule & I would have to cancel Thursday morning's class. I was disappointed to have to break my streak & struggled to admit I was not, in fact, invincible. However, I was BLOWN AWAY by the incredible support that was offered from the women in my fitness community. And it felt pretty incredible to be lifted up during a difficult personal time by my own clients, the very individuals I strive to help on a daily basis. So, the 4 year streak has ended, and it is ok! Thank you to my clients for your unbelievable support. Thank you to all who have been a part of this journey. I look forward to continuing to do this work as it is my passion! If you would like to join us in the Inspire Fitness membership, I welcome you! Please find details here: https://inspirehw.com/inspire-fitness Follow me on Instagram! https://www.instagram.com/fit.nutritionist?igsh=MTJqZXhjODR2Z
Speaker CFPs and Sponsor Guides are now available for AIE World's Fair — join us on June 25-27 for the biggest AI Engineer conference of 2024!

Soumith Chintala needs no introduction in the ML world — his insights are incredibly accessible across Twitter, LinkedIn, podcasts, and conference talks (in this pod we'll assume you'll have caught up on the History of PyTorch pod from last year and cover different topics). He's well known as the creator of PyTorch, but he's more broadly the Engineering Lead on AI Infra, PyTorch, and Generative AI at Meta. Soumith was one of the earliest supporters of Latent Space (and more recently AI News), and we were overjoyed to catch up with him on his latest SF visit for a braindump of the latest AI topics, reactions to some of our past guests, and why Open Source AI is personally so important to him.

Life in the GPU-Rich Lane

Back in January, Zuck went on Instagram to announce their GPU wealth: by the end of 2024, Meta will have 350k H100s. By adding all their GPU clusters, you'd get to 600k H100-equivalents of compute. At FP16 precision, that's ~1,200,000 PFLOPS. If we used George Hotz's (previous guest!) "Person of Compute" measure, Meta now has 60k humans of compute in their clusters. Occasionally we get glimpses into the GPU-rich life; on a recent ThursdAI chat, swyx prompted PaLM tech lead Yi Tay to write down what he missed most from Google, and he commented that UL2 20B was trained by accidentally leaving the training job running for a month, because hardware failures are so rare in Google.

Meta AI's Epic LLM Run

Before Llama broke the internet, Meta released an open source LLM in May 2022, OPT-175B, which was notable for how "open" it was - right down to the logbook! 
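As an aside, the back-of-envelope GPU-rich arithmetic above can be reproduced in a couple of lines. The per-H100 throughput and the "Person of Compute" unit are round-number assumptions on our part, not official figures:

```python
# Sanity-checking the GPU-rich numbers quoted above.
# Assumptions: ~2 PFLOPS of FP16 per H100 (a round number, not a spec
# sheet figure), and George Hotz's "Person of Compute" at 20 PFLOPS.
H100_EQUIVALENTS = 600_000           # Meta's estimated total cluster size
PFLOPS_PER_H100_FP16 = 2             # assumed round number
PERSON_OF_COMPUTE_PFLOPS = 20        # Hotz's rule-of-thumb unit

total_pflops = H100_EQUIVALENTS * PFLOPS_PER_H100_FP16
persons = total_pflops / PERSON_OF_COMPUTE_PFLOPS

print(f"{total_pflops:,} PFLOPS")            # the ~1,200,000 PFLOPS above
print(f"{persons:,.0f} humans of compute")   # the 60k figure above
```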
They used only 16 NVIDIA V100 GPUs and Soumith agrees that, with hindsight, it was likely under-trained for its parameter size. In Feb 2023 (pre Latent Space pod), Llama was released, with a 7B version trained on 1T tokens alongside 65B and 33B versions trained on 1.4T tokens. The Llama authors included Guillaume Lample and Timothée Lacroix, who went on to start Mistral. July 2023 was Llama2 time (which we covered!): 3 model sizes, 7B, 13B, and 70B, all trained on 2T tokens. The three models accounted for a grand total of 3,311,616 GPU hours for all pre-training work. CodeLlama followed shortly after, a fine-tune of Llama2 specifically focused on code generation use cases. The family had models in the 7B, 13B, 34B, and 70B sizes, all trained with 500B extra tokens of code and code-related data, except for 70B which is trained on 1T.

All of this on top of other open sourced models like Segment Anything (one of our early hits!), Detectron, Detectron 2, DensePose, and Seamless, and in one year, Meta transformed from a company people made fun of for its "metaverse" investments to one of the key players in the AI landscape and its stock has almost tripled since (about $830B in market value created in the past year).

Why Open Source AI

The obvious question is why Meta would spend hundreds of millions on its AI efforts and then release them for free. Zuck has addressed this in public statements. But for Soumith, the motivation is even more personal:

"I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India… And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for like zero dollars. And I think that was a strong reason why I ended up where I am. 
So like that, like the open source side of things, I always push regardless of like what I get paid for, like I think I would do that as a passion project on the side…

…I think at a fundamental level, the most beneficial value of open source is that you make the distribution to be very wide. It's just available with no friction and people can do transformative things in a way that's very accessible. Maybe it's open source, but it has a commercial license and I'm a student in India. I don't care about the license. I just don't even understand the license. But like the fact that I can use it and do something with it is very transformative to me…

…Like, okay, I again always go back to like I'm a student in India with no money. What is my accessibility to any of these closed source models? At some scale I have to pay money. That makes it a non-starter and stuff. And there's also the control issue: I strongly believe if you want human aligned AI, you want all humans to give feedback. And you want all humans to have access to that technology in the first place. And I actually have seen, living in New York, whenever I come to Silicon Valley, I see a different cultural bubble."

We like the way Soumith put it last year: Closed AI "rate-limits against people's imaginations and needs"!

What It Takes For Open Source AI to Win

However Soumith doesn't think Open Source will simply win by popular demand. There is a tremendous coordination problem with the decentralized nature of open source AI development right now: nobody is collecting the valuable human feedback in the way that OpenAI or Midjourney are doing.

"Open source in general always has a coordination problem. If there's a vertically integrated provider with more resources, they will just be better coordinated than open source. And so now open source has to figure out how to have coordinated benefits. And the reason you want coordinated benefits is because these models are getting better based on human feedback. 
And if you see with open source models, like if you go to the /r/localllama subreddit, like there's so many variations of models that are being produced from, say, Nous research. I mean, like there's like so many variations built by so many people. And one common theme is they're all using these fine-tuning or human preferences datasets that are very limited and they're not sufficiently diverse. And you look at the other side, say front-ends like Oobabooga or like Hugging Chat or Ollama, they don't really have feedback buttons. All the people using all these front-ends, they probably want to give feedback, but there's no way for them to give feedback… So we're just losing all of this feedback. Maybe open source models are being as used as GPT is at this point in like all kinds of, in a very fragmented way, like in aggregate all the open source models together are probably being used as much as GPT is, maybe close to that. But the amount of feedback that is driving back into the open source ecosystem is like negligible, maybe less than 1% of like the usage. 
So I think like some, like the blueprint here I think is you'd want someone to create a sinkhole for the feedback… I think if we do that, if that actually happens, I think that probably has a real chance of the open source models having a runaway effect against OpenAI, I think like there's a clear chance we can take at truly winning open source."

If you're working on solving open source coordination, please get in touch!

Show Notes

* Soumith Chintala Twitter
* History of PyTorch episode on Gradient Podcast
* The Llama Ecosystem
* Apple's MLX
* Neural ODEs (Ordinary Differential Equations)
* AlphaGo
* LMSys arena
* Dan Pink's "Drive"
* Robotics projects:
* Dobb-E
* OK Robot
* Yann LeCun
* Yangqing Jia of Lepton AI
* Ed Catmull
* George Hotz on Latent Space
* Chris Lattner on Latent Space
* Guillaume Lample
* Yannic Kilcher of OpenAssistant
* LMSys
* Alex Atallah of OpenRouter
* Carlo Sferrazza's 3D tactile research
* Alex Wiltschko of Osmo
* Tangent by Alex Wiltschko
* Lerrel Pinto - Robotics

Timestamps

* [00:00:00] Introductions
* [00:00:51] Extrinsic vs Intrinsic Success
* [00:02:40] Importance of Open Source and Its Impact
* [00:03:46] PyTorch vs TinyGrad
* [00:08:33] Why PyTorch is the Switzerland of frameworks
* [00:10:27] Modular's Mojo + PyTorch?
* [00:13:32] PyTorch vs Apple's MLX
* [00:16:27] FAIR / PyTorch Alumni
* [00:18:50] How can AI inference providers differentiate?
* [00:21:41] How to build good benchmarks and learnings from AnyScale's
* [00:25:28] Most interesting unexplored ideas
* [00:28:18] What people get wrong about synthetic data
* [00:35:57] Meta AI's evolution
* [00:38:42] How do you allocate 600,000 GPUs?
* [00:42:05] Even the GPU Rich are GPU Poor
* [00:47:31] Meta's MTIA silicon
* [00:50:09] Why we need open source
* [00:59:00] Open source's coordination problem for feedback gathering
* [01:08:59] Beyond text generation
* [01:15:37] Osmo and the Future of Smell Recognition Technology

Transcript

Alessio [00:00:00]: Hey everyone, welcome to the Latent Space podcast. 
This is Alessio, partner and CTO in residence at Decibel Partners, and I'm joined by my co-host Swyx, founder of Smol AI.

Swyx [00:00:15]: Hey, and today we have in the studio Soumith Chintala, welcome.

Soumith [00:00:17]: Thanks for having me.

Swyx [00:00:18]: On one of your rare visits from New York where you live. You got your start in computer vision at NYU with Yann LeCun. That was a very fortuitous start. I was actually listening to your interview on the Gradient podcast. So if people want to know more about the history of Soumith, history of PyTorch, they can go to that podcast. We won't spend that much time there, but I just was marveling at your luck, or I don't know if it's your luck or your drive to find AI early and then find the right quality mentor because I guess Yann really sort of introduced you to that world.

Soumith [00:00:51]: Yeah, I think you're talking about extrinsic success, right? A lot of people just have drive to do things that they think is fun, and a lot of those things might or might not be extrinsically perceived as good and successful. I think I just happened to like something that is now one of the coolest things in the world or whatever. But if I happen, the first thing I tried to become was a 3D VFX artist, and I was really interested in doing that, but I turned out to be very bad at it. So I ended up not doing that further. But even if I was good at that, whatever, and I ended up going down that path, I probably would have been equally happy. It's just like maybe like the perception of, oh, is this person successful or not might be different. I think like after a baseline, like your happiness is probably more correlated with your intrinsic stuff.

Swyx [00:01:44]: Yes. I think Dan Pink has this book on drive that I often refer to about the power of intrinsic motivation versus extrinsic and how long extrinsic lasts. It's not very long at all. But anyway, now you are an investor in Runway, so in a way you're working on VFX. 
Yes.

Soumith [00:02:01]: I mean, in a very convoluted way.

Swyx [00:02:03]: It reminds me of Ed Catmull. I don't know if you guys know, but he actually tried to become an animator in his early years and failed or didn't get accepted by Disney and then went and created Pixar and then got bought by Disney and created Toy Story. So you joined Facebook in 2014 and eventually became a creator and maintainer of PyTorch. And there's this long story there you can refer to on the Gradient. I think maybe people don't know that you were also involved in more sort of hardware and cluster decision affairs. And we can dive into more details there because we're all about hardware this month. Yeah. And then finally, I don't know what else, like what else should people know about you on a personal side or professional side?

Soumith [00:02:40]: I think open source is definitely a big passion of mine and probably forms a little bit of my identity at this point. I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India. I didn't have internet for a while. In college, actually, I didn't have internet except for GPRS or whatever. And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for zero dollars. And I think that was a strong reason why I ended up where I am. So the open source side of things, I always push regardless of what I get paid for, like I think I would do that as a passion project on the side.

Swyx [00:03:35]: Yeah, that's wonderful. Well, we'll talk about the challenges as well that open source has, open models versus closed models. Maybe you want to touch a little bit on PyTorch before we move on to the sort of Meta AI in general.

PyTorch vs TinyGrad tradeoffs

Alessio [00:03:46]: Yeah, we kind of touched on PyTorch in a lot of episodes. 
So we had George Hotz from TinyGrad. He called PyTorch a CISC and TinyGrad a RISC. I would love to get your thoughts on PyTorch design direction as far as, I know you talk a lot about kind of having a happy path to start with and then making complexity hidden away but then available to the end user. One of the things that George mentioned is I think you have like 250 primitive operators in PyTorch, I think TinyGrad is four. So how do you think about some of the learnings that maybe he's going to run into that you already had in the past seven, eight years almost of running PyTorch?

Soumith [00:04:24]: Yeah, I think there's different models here, but I think it's two different models that people generally start with. Either they go like, I have a grand vision and I'm going to build a giant system that achieves this grand vision and maybe one is super feature complete or whatever. Or other people say they will get incrementally ambitious, right? And they say, oh, we'll start with something simple and then we'll slowly layer out complexity in a way that optimally applies Huffman coding or whatever. Like where the density of users are and what they're using, I would want to keep it in the easy, happy path, and where the more niche advanced use cases are, I'll still want people to try them, but they need to take additional frictional steps. George, I think just like we started with PyTorch, George started with the incrementally ambitious thing. I remember TinyGrad used to be, like we would be limited to a thousand lines of code, and I think now it's at 5,000. So I think there is no real magic to why PyTorch has the kind of complexity it has. I think it's probably partly necessitated and partly because we built with the technology available under us at that time. PyTorch is like 190,000 lines of code or something at this point. I think if you had to rewrite it, we would probably think about ways to rewrite it in a vastly simplified way for sure. 
But a lot of that complexity comes from the fact that, in a very simple, explainable way, you have memory hierarchies. Your CPU has three levels of caches, and then you have DRAM and SSD, and then you have network. Similarly, a GPU has several levels of memory and then you have different levels of network hierarchies, NVLink plus InfiniBand or RoCE or something like that, right? And the way the flops are available on your hardware, they are available in a certain way and your computation is in a certain way and you have to retrofit your computation onto both the memory hierarchy and the flops available. When you're doing this, it is actually a fairly hard mathematical problem to do this setup, like to find the optimal thing. And what is optimal depends on the input variables themselves. So like, okay, what is the shape of your input tensors and what is the operation you're trying to do and various things like that. Finding that optimal configuration and writing it down in code is not the same for every input configuration you have. Like for example, just as the shape of the tensors change, let's say you have three input tensors into a Sparstar product or something like that. The shape of each of these input tensors will vastly change how you optimally place this operation onto the hardware in a way that will get you maximal throughput. So a lot of our complexity comes from writing out hundreds of configurations for each single PyTorch operator and templatizing these things and symbolically generating the final CUDA code or CPU code. There's no way to avoid it, because mathematically we haven't found symbolic ways to do this that also keep compile time near zero. You can write a very simple framework, but then you also should be willing to eat the long compile time of searching for that optimal performance at runtime. But that's the trade-off. 
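The shape-dependent configuration selection Soumith describes can be caricatured in a few lines. This is a toy sketch with invented names and thresholds, nothing like the templated config tables in real operator libraries, which carry hundreds of entries per operator:

```python
# Toy sketch of shape-based kernel config dispatch, loosely in the
# spirit of the discussion above. The function name, the configs, and
# the thresholds are all invented for illustration.
def pick_matmul_config(m: int, n: int, k: int) -> dict:
    """Choose a (tile, unroll) config from the input shapes of an
    m x k by k x n matrix multiply."""
    if m * n * k < 1 << 14:          # tiny problem: minimize launch overhead
        return {"tile": (8, 8), "unroll": 1}
    if k > max(m, n):                # reduction-heavy (tall-skinny) case
        return {"tile": (16, 64), "unroll": 4}
    return {"tile": (32, 32), "unroll": 2}   # generic large case

# Different shapes land on different configs:
small = pick_matmul_config(4, 4, 4)
skinny = pick_matmul_config(64, 64, 4096)
big = pick_matmul_config(512, 512, 512)
```

Real systems either precompute such tables per shape class or search them at compile/run time (autotuning), which is exactly where the compile-time vs. runtime-performance trade-off described above comes from.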
There's no, like, I don't think unless we have great breakthroughs George's vision is achievable. He should be thinking about a narrower problem, such as I'm only going to make this work for self-driving car convnets, or I'm only going to make this work for LLM transformers of the Llama style. Like if you start narrowing the problem down, you can make a vastly simpler framework. But if you don't, if you need the generality to power all of the AI research that is happening and keep zero compile time and all these other factors, I think it's not easy to avoid the complexity.

PyTorch vs Mojo

Alessio [00:08:33]: That's interesting. And we kind of touched on this with Chris Lattner when he was on the podcast. If you think about frameworks, they have the model target. They have the hardware target. They have different things to think about. He mentioned when he was at Google, TensorFlow trying to be optimized to make TPUs go brr, you know, and go as fast. I think George is trying to make especially the AMD stack be better than ROCm. How come PyTorch has been such a Switzerland versus just making Meta hardware go brr?

Soumith [00:09:00]: First, Meta is not in the business of selling hardware. Meta is not in the business of cloud compute. The way Meta thinks about funding PyTorch is we're funding it because it's net good for Meta to fund PyTorch, because PyTorch has become a standard and a big open source project. And generally it gives us a timeline edge. It gives us leverage and all that within our own work. So why is PyTorch more of a Switzerland rather than being opinionated? I think the way we think about it is not in terms of Switzerland or not. The way we articulate it to all hardware vendors and software vendors who come to us wanting to build a backend in core for PyTorch and ship it by default is we just only look at our user side of things. Like if users are using a particular piece of hardware, then we want to support it. 
We very much don't want to kingmake the hardware side of things. So as the MacBooks have GPUs and as that stuff started getting increasingly interesting, we pushed Apple to put some engineers on it and work on the MPS support, and we spent significant time from Meta-funded engineers on that as well, because a lot of people are using the Apple GPUs and there's demand. So we kind of mostly look at it from the demand side. We never look at it from, like, oh, which hardware should we start taking opinions on.Swyx [00:10:27]: Is there a future in which, because Mojo or Modular Mojo is kind of a superset of Python, is there a future in which PyTorch might use Mojo features optionally?Soumith [00:10:36]: I think it depends on how well integrated it is into the Python ecosystem. So if Mojo is like a pip install and it's readily available, and users feel like they can use Mojo so smoothly within their workflows in a way that just is low friction, we would definitely look into that. In the same way PyTorch now depends on Triton, OpenAI Triton, and we never had a conversation that was like, huh, that's like a dependency. Should we just build a Triton of our own or should we use Triton? Those conversations don't really come up for us. The conversations are more: well, does Triton have 10,000 dependencies and is it hard to install? We almost don't look at these things from a strategic leverage point of view. We look at these things from a user experience point of view, like is it easy to install? Is it smoothly integrated, and does it give enough benefits for us to start depending on it? If so, yeah, we should consider it. That's how we think about it.Swyx [00:11:37]: You're inclusive by default as long as it meets the minimum bar of, yeah, but like maybe I phrased it wrongly.
Maybe it's more like what problems would you look to solve that you have right now?Soumith [00:11:48]: I think it depends on what problems Mojo will be useful at.Swyx [00:11:52]: Mainly a performance pitch, some amount of cross-compiling pitch.Soumith [00:11:56]: Yeah, I think the performance pitch for Mojo was like, we're going to be performant even if you have a lot of custom stuff. You're going to write arbitrary custom things and we will be performant. And that value proposition is not clear to us from the PyTorch side to consider it for PyTorch. So PyTorch, it's actually not 250 operators, it's like a thousand operators. PyTorch exposes about a thousand operators and people kind of write their ideas in the thousand operators of PyTorch. Mojo is like, well, maybe it's okay to completely sidestep those thousand operators of PyTorch and just write it in a more natural form. Just write raw Python, write for loops or whatever, right? So from the consideration of how do we intersect PyTorch with Mojo, I can see one use case where you have custom stuff for some parts of your program, but mostly it's PyTorch. And so we can probably figure out how to make it easier for, say, torch.compile to smoothly also consume Mojo subgraphs and, you know, the interoperability being actually usable, that I think is valuable. But Mojo as a fundamental front end would be replacing PyTorch, not augmenting PyTorch. So in that sense, I don't see a synergy in more deeply integrating Mojo.Pytorch vs MLXSwyx [00:13:21]: So call out to Mojo whenever they have written something in Mojo and there's some performance-related thing going on. And then since you mentioned Apple, what should people think of PyTorch versus MLX?Soumith [00:13:32]: I mean, MLX is early and I know the folks well. Awni used to work at FAIR and I used to chat with him all the time. He used to be based out of New York as well. The way I think about MLX is that MLX is specialized for Apple right now.
It has a happy path because it's defined its product in a narrow way. At some point, MLX either says we will only be supporting Apple, and we will just focus on enabling, you know, being the framework for you if you use your MacBook, but once you go server side or whatever, that's not my problem and I don't care. Or MLX enters the server-side set of things as well. One of these two things will happen, right? If the first thing happens, MLX's overall addressable market will be small, but it'll probably do well within that addressable market. If it enters the second phase, they're going to run into all the same complexities that we have to deal with. They will not have any magic wand, and they will have more complex work to do. They probably wouldn't be able to move as fast.Swyx [00:14:44]: Like having to deal with distributed compute?Soumith [00:14:48]: Distributed, NVIDIA and AMD GPUs, just having a generalization of the concept of a backend, how they treat compilation and its overheads. Right now they've deeply assumed the whole MPS graph thing. So they need to think about all these additional things if they end up expanding onto the server side, and they'll probably build something like PyTorch as well, right? Like eventually that's where it will land. And I think there they will kind of fail on the lack of differentiation. Like it wouldn't be obvious to people why they would want to use it.Swyx [00:15:24]: I mean, there are some cloud companies offering M1 and M2 chips on servers. I feel like it might be interesting for Apple to pursue that market, but it's not their core strength.Soumith [00:15:33]: Yeah. If Apple can figure out their interconnect story, maybe, like then it can become a thing.Swyx [00:15:40]: Honestly, that's more interesting than the cars. Yes.Soumith [00:15:43]: I think the moat that NVIDIA has right now, I feel, is that they have the interconnect that no one else has. Like AMD GPUs are pretty good.
I'm sure there's various silicon that is not bad at all, but the interconnect, like NVLink, is uniquely awesome. I'm sure the other hardware providers are working on it, but-Swyx [00:16:04]: I feel like when you say it's uniquely awesome, you have some appreciation of it that the rest of us don't. I mean, the rest of us just, like, you know, we hear marketing lines, but what do you mean when you say NVIDIA is very good at networking? Obviously they made the acquisition maybe like five years ago.Soumith [00:16:15]: Just the bandwidth it offers and the latency it offers. I mean, TPUs also have a good interconnect, but you can't buy them. So you have to go to Google to use it.PyTorch MafiaAlessio [00:16:27]: Who are some of the other FAIR PyTorch alumni that are building cool companies? I know you have Fireworks AI, Lightning AI, Lepton, and Yangqing, you knew since college when he was building Caffe?Soumith [00:16:40]: Yeah, so Yangqing and I used to be framework rivals, PyTorch, I mean, we were all a very small close-knit community back then. Caffe, Torch, Theano, Chainer, Keras, various frameworks. I mean, it used to be more like 20 frameworks. I can't remember all the names. CCV by Liu Liu, who is also based out of SF. And actually, you know, one of the ways it was interesting is you went into the framework guts and saw if someone wrote their own convolution kernel or they were just copying someone else's. There were four or five convolution kernels that were unique and interesting. There was one from this guy out of Russia, I forget the name, but I remember he was awesome enough to have written his own kernel. And at some point there, I built out these benchmarks called convnet-benchmarks, just benchmarking all the convolution kernels that were available at that time.
It hilariously became big enough that at that time AI was getting important, but not important enough that industrial-strength players came in to do these kinds of benchmarking and standardization, like we have MLPerf today. So a lot of the startups were using convnet-benchmarks in their pitch decks, as in, oh, you know, on convnet-benchmarks, this is how we fare, so you should fund us. I remember Nervana actually was at the top of the pack because Scott Gray wrote amazingly fast convolution kernels at that time. Very interesting, but separate times. But to answer your question, Alessio, I think mainly Lepton and Fireworks are the two most obvious ones, but I'm sure the fingerprints are a lot wider. They're just people who worked within the PyTorch/Caffe2 cohort of things and now end up at various other places.Swyx [00:18:50]: I think, both as an investor and as someone looking to build on top of their services, it's an uncomfortable, like, I-don't-know-what-I-don't-know pitch. Because I've met Yangqing and I've met Lin Qiao. Yeah, I've met these folks and they're like, you know, we are deep in the PyTorch ecosystem and we serve billions of inferences a day or whatever at Facebook and now we can do it for you. And I'm like, okay, that's great. Like, what should I be wary of or cautious of when these things happen? Because obviously this experience is extremely powerful and valuable. I just don't know what I don't know. Like, what should people know about these sort of new inference-as-a-service companies?Soumith [00:19:32]: I think at that point you would be investing in them for their expertise of one kind. So if they've been at a large company, but they've been doing amazing work, you would be thinking about it as: what these people bring to the table is that they're really good at GPU programming or understanding the complexity of serving models once it hits a certain scale.
You know, various expertise like that from the infra and AI and GPUs point of view. What you would obviously want to figure out is whether their understanding of the external markets is clear, whether they know and understand how to think about running a business, understanding how to be disciplined about making money, or, you know, various things like that.Swyx [00:20:23]: Maybe I'll put it like, actually I will de-emphasize the investing bit and just more as a potential customer. Oh, okay. Like, it's more okay, you know, you have PyTorch gods, of course. Like, what else should I know?Soumith [00:20:37]: I mean, I would not care about who's building something. If I'm trying to be a customer, I would care about whether...Swyx [00:20:44]: Benchmarks.Soumith [00:20:44]: Yeah, whether I'd use it, and its usability and reliability and speed, right?Swyx [00:20:51]: Quality as well.Soumith [00:20:51]: Yeah, if someone from some random unknown place came to me and said, use our stuff, it's great, and I have the bandwidth, I probably will give it a shot. And if it turns out to be great, I'll just use it.Benchmark dramaSwyx [00:21:07]: Okay, great. And then maybe one more thing about benchmarks, since we already brought it up and you brought up convnet-benchmarks. There was some recent drama around Anyscale. Anyscale released their own benchmarks, and obviously they look great on their own benchmarks, but maybe didn't give the other... I feel there are two lines of criticism. One, which is they didn't test apples to apples on the kind of endpoints that the other providers, that they are competitors with, on their benchmarks, and that is a due-diligence baseline. And then the second would be more just optimizing for the right thing. You had some commentary on it. I'll just kind of let you riff.Soumith [00:21:41]: Yeah, I mean, in summary, basically my criticism of that was Anyscale built these benchmarks for end users to just understand what they should pick, right?
And that's a very good thing to do. I think what they didn't do a good job of is give that end user a full understanding of what they should pick. They just gave them a very narrow slice of understanding. I think they just gave them latency numbers, and that's not sufficient, right? You need to understand your total cost of ownership at some reasonable scale. Not, oh, one API call is one cent, but a thousand API calls are 10 cents. People can misprice to cheat on those benchmarks. So you want to understand, okay, how much is it going to cost me if I actually subscribe to you and do, like, a million API calls a month or something? And then you want to understand the latency and reliability, not just from one call you made, but an aggregate of calls you've made over various times of the day and times of the week. And the nature of the workloads: is it just some generic single paragraph that you're sending that is cacheable? Or is it testing a real-world workload? That kind of rigor in presenting the benchmark wasn't there. It was a much more narrow sliver of what should have been a good benchmark. That was my main criticism. And I'm pretty sure if, before they released it, they had shown it to the other stakeholders who would care about this benchmark because they are present in it, those stakeholders would have easily pointed out these gaps. And I think they didn't do that and just released it. So I think those were the two main criticisms. I think they were fair and Robert took it well.Swyx [00:23:40]: And he took it very well. And we'll have him on at some point and we'll discuss it. But I think it's important, I think the market maturing enough that people start caring and competing on these kinds of things means that we need to establish what best practice is, because otherwise everyone's going to play dirty.Soumith [00:23:55]: Yeah, absolutely.
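The kind of rigor Soumith is asking for can be sketched in a few lines of plain Python: cost at a realistic call volume rather than per call, and latency aggregated over many calls rather than one. All names and numbers below are made up for illustration, not from any real benchmark.

```python
# Toy sketch of aggregate benchmark metrics: total cost of ownership at a
# realistic volume, and latency percentiles over many calls, not one call.
import statistics

def monthly_cost(price_per_1k_calls, calls_per_month):
    """Dollars per month at a given per-1k-calls price (illustrative)."""
    return price_per_1k_calls * calls_per_month / 1000

def latency_summary(samples_ms):
    """Median and p95 over a batch of latency samples in milliseconds."""
    samples = sorted(samples_ms)
    p95_idx = int(0.95 * (len(samples) - 1))
    return {"p50": statistics.median(samples), "p95": samples[p95_idx]}

# One cheap-looking call still means real money at a million calls a month:
print(monthly_cost(0.01, 1_000_000))  # -> 10.0 dollars

# A single fast call hides the occasional 480 ms outlier; aggregates don't:
print(latency_summary([120, 135, 110, 480, 125, 130, 115, 140, 138, 122]))
```

A fair benchmark would further repeat this across times of day, days of week, and cacheable versus realistic payloads, which is exactly the gap being described.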
My view of the LLM inference market in general is that it's the laundromat model. The margins are going to be driven down towards the bare minimum. It's going to be all kinds of arbitrage between how much you can get the hardware for, how much you sell the API for, and how much latency your customers are willing to let go of. You need to figure out how to squeeze your margins: what is your unique thing here? I think Together and Fireworks and all these people are trying to build some faster CUDA kernels and faster, you know, hardware kernels in general. But those moats only last for a month or two. These ideas quickly propagate.Swyx [00:24:38]: Even if they're not published?Soumith [00:24:39]: Even if they're not published, the idea space is small. So even if they're not published, the discovery rate is going to be pretty high. It's not like we're talking about a combinatorial thing that is really large. You're talking about Llama-style LLM models, and we're going to beat those to death on a few different hardware SKUs, right? It's not like we have a huge diversity of hardware that you're going to aim to run it on. When you have such a narrow problem and you have a lot of people working on it, the rate at which these ideas are going to get figured out is going to be pretty rapid.Swyx [00:25:15]: Is it a standard bag of tricks? Like the standard one that I know of is, you know, fusing operators and-Soumith [00:25:22]: Yeah, it's the standard bag of tricks of figuring out how to improve your memory bandwidth and all that, yeah.Alessio [00:25:28]: Any ideas, instead, of things that are not being beaten to death that people should be paying more attention to?Novel PyTorch ApplicationsSwyx [00:25:34]: One thing I was like, you know, you have a thousand operators, right?
Like what's the most interesting usage of PyTorch that you're seeing maybe outside of this little bubble?Soumith [00:25:41]: So PyTorch, it's very interesting and scary at the same time, but basically it's used in a lot of exotic ways. From the ML angle, what kind of models are being built? You get all the way from state-space models and all of these things to stuff like nth-order differentiable models, like neural ODEs and stuff like that. So I think there's one set of interestingness factors from the ML side of things. And then there's the other set of interesting factors from the applications point of view. It's used in everything from Mars rover simulations to drug discovery to Tesla cars. There's a huge diversity of applications in which it is used. So in terms of the most interesting application side of things, I think I'm scared at how many interesting things it is used in that are also very critical and really important. I think the scariest was when I went to visit CERN at some point and they said they were using PyTorch, and they were using GANs at the same time for particle physics research. And I was scared more about the fact that they were using GANs than that they were using PyTorch, because at that time I was a researcher focusing on GANs. But the diversity is probably the most interesting, how many different things it is being used in. I think that's the most interesting to me from the applications perspective. From the models perspective, I think I've seen a lot of them. The really interesting ones to me are where we're starting to combine search and symbolic stuff with differentiable models. The whole AlphaGo style of models is one example. And then I think we're attempting to do it for LLMs as well, with various reward models and search. I mean, I don't think PyTorch is being used in this, but the whole AlphaGeometry thing was interesting because, again, it's an example of combining the symbolic models with the gradient-based ones.
But there is stuff like AlphaGeometry that PyTorch is used in, especially when you intersect biology and chemistry with ML. In those areas, you want stronger guarantees on the output. So yeah, maybe from the ML side, those things to me are very interesting right now.Swyx [00:28:03]: Yeah. People are very excited about the AlphaGeometry thing. And it's kind of like, for me, it's theoretical. It's great. You can solve some Olympiad questions. I'm not sure how to make that bridge over into the real world applications, but I'm sure people smarter than me will figure it out.Synthetic Data vs Symbolic ModelsSoumith [00:28:18]: Let me give you an example of it. You know how the whole thing about synthetic data being the next rage in LLMs is a thing?Swyx [00:28:27]: Already is a rage.Soumith [00:28:28]: Which I think is fairly misplaced in how people perceive it. People think synthetic data is some kind of magic wand that you wave and it's going to be amazing. Synthetic data is useful in neural networks right now because we as humans have figured out a bunch of symbolic models of the world, or made up certain symbolic models because of human innate biases. So we've figured out how to ground particle physics in a 30-parameter model. And it's just very hard to compute, as in it takes a lot of flops to compute, but it only has 30 parameters or so. I mean, I'm not a physics expert, but it's a very low-rank model. We built mathematics as a field that basically is very low-rank. Language, a deep understanding of language, like the whole syntactic parse trees and just understanding how language can be broken down into a formal symbolism, is something that we figured out. So we basically as humans have accumulated all this knowledge on these subjects: either synthetic, we created those subjects in our heads, or we grounded some real-world phenomenon into a set of symbols. But we haven't figured out how to teach neural networks symbolic world models directly.
The only way we have to teach them is generating a bunch of inputs and outputs and doing gradient descent over them. So in areas where we have the symbolic models, and we need to teach all the knowledge we have that is better encoded in the symbolic models, what we're doing is generating a bunch of synthetic data, a bunch of input-output pairs, and then giving that to the neural network and asking it to learn, via gradient descent in a much more over-parameterized way, the same thing that we already have a better low-rank model of. Outside of this, where we don't have good symbolic models, synthetic data obviously doesn't make any sense. So synthetic data is not a magic wand that will work in every case. It's just for what we as humans already have good symbolic models of. We need to impart that knowledge to neural networks, and we've figured out that synthetic data is a vehicle to impart this knowledge. But people, maybe because they don't know enough about synthetic data as a notion, hear, you know, the next wave of data revolution is synthetic data, and they think it's some kind of magic where we just create a bunch of random data somehow. They don't think about the how, and they think that's just a revolution. And I think that's maybe a gap in understanding most people have in this hype cycle.Swyx [00:31:23]: Yeah, well, it's a relatively new concept, so. Oh, there's two more that I'll put in front of you and then you can see what you respond. One is, you know, I have this joke that it's only synthetic data if it's from the Mistral region of France, otherwise it's just a sparkling distillation, which is what Nous Research is doing. Like they're distilling GPT-4 by creating synthetic data from GPT-4, creating mock textbooks inspired by Phi 2 and then fine-tuning open source models like Llama. And so I don't know, I mean, I think that's, should we call that synthetic data? Should we call it something else?
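The pattern Soumith describes can be sketched in plain Python: hold a compact symbolic model (here a hypothetical two-parameter rule, y = 3x + 1, standing in for the "30-parameter physics model"), generate synthetic input/output pairs from it, and gradient-descend an over-parameterized-style learner onto the same mapping.

```python
# Sketch of distilling a symbolic model into a gradient-trained one via
# synthetic data. The rule and all constants are invented for illustration.
import random

random.seed(0)

def symbolic_model(x):
    """The compact 'low-rank' model we already understand."""
    return 3.0 * x + 1.0

# 1. Generate synthetic input/output pairs from the symbolic model.
data = [(x, symbolic_model(x))
        for x in (random.uniform(-1, 1) for _ in range(200))]

# 2. Teach a parametric learner the same rule by gradient descent.
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    for x, y in data:
        err = (w * x + b) - y   # prediction error on one synthetic pair
        w -= lr * err * x       # per-sample gradient step on the weight
        b -= lr * err           # ...and on the bias

print(round(w, 2), round(b, 2))  # converges to 3.0 1.0
```

Where no such symbolic model exists there is nothing principled to generate from, which is exactly the limitation being argued here.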
I don't know.Soumith [00:31:57]: Yeah, I mean, the outputs of LLMs, are they synthetic data? They probably are, but I think it depends on the goal you have. If your goal is you're creating synthetic data with the goal of trying to distill GPT-4's superiority into another model, I guess you can call it synthetic data, but it also feels disingenuous, because your goal is, I need to copy the behavior of GPT-4 and-Swyx [00:32:25]: It's also not just behavior, but data set. So I've often thought of this as data set washing. Like you need one model at the top of the chain, you know, unnamed French company that, you know, makes a model that has all the data in it that we don't know where it's from, but it's open source, hey, and then we distill from that and it's great. To be fair, they also use larger models as judges for preference ranking, right? So that is, I think, a very, very accepted use of synthetic.Soumith [00:32:53]: Correct. I think it's a very interesting time where we don't really have good social models of what is acceptable depending on how many bits of information you use from someone else, right? It's like, okay, you use one bit. Is that okay? Yeah, let's accept it to be okay. Okay, what about if you use 20 bits? Is that okay? I don't know. What if you use 200 bits? I don't think we as a society have ever been in this conundrum where we have to be like, where is the boundary of copyright, or where is the boundary of socially accepted understanding of copying someone else? We haven't been tested on this mathematically before,Swyx [00:33:38]: in my opinion. Whether it's transformative use. Yes. So yeah, I think this New York Times-OpenAI case is gonna go to the Supreme Court and we'll have to decide it, because I think we never had to deal with it before.
And then finally, for synthetic data, the thing that I'm personally exploring is solving this great stark paradigm difference between RAG and fine-tuning, where you can kind of create synthetic data off of your retrieved documents and then fine-tune on that. That's kind of synthetic. All you need is variation or diversity of samples for you to fine-tune on. And then you can fine-tune new knowledge into your model. I don't know if you've seen that as a direction for synthetic data.Soumith [00:34:13]: I think what you're doing is you're saying, well, language, I know how to parametrize language to an extent. And I need to teach my model variations of this input data so that it's resilient or invariant to language uses of that data.Swyx [00:34:32]: Yeah, it doesn't overfit on the wrong source documents.Soumith [00:34:33]: So I think that's 100% synthetic. You understand, the key is you create variations of your documents, and you know how to do that because you have a symbolic model, or some implicit symbolic model, of language.Swyx [00:34:48]: Okay.Alessio [00:34:49]: Do you think the issue with symbolic models is just the architecture of the language models that we're building? I think maybe the thing that people grasp at is the inability of transformers to deal with numbers because of the tokenizer. Is it a fundamental issue there too? And do you see alternative architectures that will be better with symbolic understanding?Soumith [00:35:09]: I am not sure if it's a fundamental issue or not. I think we just don't understand transformers enough. I don't even mean transformers as an architecture. I mean the use of transformers today, like combining the tokenizer and transformers and the dynamics of training, when you show math-heavy questions versus not. I don't have a good calibration of whether I know the answer or not. You know, there are common criticisms that, you know, transformers will just fail at X.
But then when you scale them up to sufficient scale, they actually don't fail at that X. I think there's this entire subfield where they're trying to figure out these answers, called the science of deep learning or something. So we'll get to know more. I don't know the answer.Meta AI and Llama 2/3Swyx [00:35:57]: Got it. Let's touch a little bit on just Meta AI and, you know, stuff that's going on there. Maybe, I don't know how deeply you're personally involved in it, but you're our first guest with Meta AI, which is really fantastic. And Llama 1 was, you know, you are such a believer in open source. Llama 1 was more or less the real breakthrough in open source AI. The most interesting thing for us covering it on this podcast was the death of Chinchilla, as people say. Any interesting insights there around the scaling laws for open source models or smaller models or whatever that design decision was when you guys were doing it?Soumith [00:36:31]: So Llama 1 was Guillaume Lample and team. There was OPT before, which I think I'm also very proud of, because we bridged the gap in understanding of how complex it is to train these models to the world. Until then, no one had really published it in gory detail.Swyx [00:36:50]: The logs.Soumith [00:36:51]: Yeah. Like, why is it complex? And everyone says, oh, it's complex. But no one really talked about why it's complex. I think OPT was cool.Swyx [00:37:02]: I met Susan and she's very, very outspoken. Yeah.Soumith [00:37:05]: We probably, I think, didn't train it for long enough, right? That's kind of obvious in retrospect.Swyx [00:37:12]: For a 175B. Yeah. You trained it according to Chinchilla at the time or?Soumith [00:37:17]: I can't remember the details, but I think it's a commonly held belief at this point that if we had trained OPT longer, it would actually end up being better. Llama 1, I think, was Guillaume Lample and team. Guillaume is fantastic and went on to build Mistral.
I wasn't too involved in that side of things. So I don't know what you're asking me, which is how did they think about scaling laws and all of that. Llama 2, I was more closely involved in. I helped them a reasonable amount with their infrastructure needs and stuff. And Llama 2, I think, was more like, let's get to the next evolution. At that point, we kind of understood what we were missing from the industry's understanding of LLMs. We needed more data and we needed to train the models for longer. And we made, I think, a few tweaks to the architecture and we scaled up more. And that was Llama 2. I think with Llama 2, you can think of it as, after Guillaume left, the team kind of rebuilt their muscle around Llama 2. And Hugo, I think, who's the first author, is fantastic. And I think he did play a reasonably big role in Llama 1 as well.Soumith [00:38:35]: And he overlaps between Llama 1 and 2. So Llama 3, obviously, hopefully, it'll be awesome.Alessio [00:38:42]: Just one question on Llama 2, and then we'll try and fish Llama 3 spoilers out of you. In the Llama 2 paper, the loss curves of the 34B and 70B parameter models still seem kind of steep, like they could go lower. From an infrastructure level, how do you allocate resources? Could they have just gone longer, or were you just, hey, this is all the GPUs that we can burn, and let's just move on to Llama 3 and then make that one better?Soumith [00:39:07]: Instead of answering specifically about that Llama 2 situation or whatever, I'll tell you how we think about things. Generally, I mean, Mark released some numbers, right?Swyx [00:39:20]: So let's cite those things again. All I remember is like 600K GPUs.Soumith [00:39:24]: That is by the end of this year, and it's 600K H100 equivalents. With 250K H100s, including all of our other GPU or accelerator stuff, it would be 600-and-something-K aggregate capacity.Swyx [00:39:38]: That's a lot of GPUs.Soumith [00:39:39]: We'll talk about that separately.
But the way we think about it is we have a train of models, right? Llama 1, 2, 3, 4. And we have a bunch of GPUs. I don't think we're short of GPUs. Like-Swyx [00:39:54]: Yeah, no, I wouldn't say so. Yeah, so it's all a matter of time.Soumith [00:39:56]: I think time is the biggest bottleneck. It's like, when do you stop training the previous one and when do you start training the next one? And how do you make those decisions? The data: do you have net new data, better clean data for the next one, in a way that it's not worth really focusing on the previous one? It's just a standard iterative product. You're like, when is the iPhone 1 done? When do you start working on the iPhone 2? And so on, right? So mostly the considerations are time and generation, rather than GPUs, in my opinion.Alessio [00:40:31]: So one of the things with the scaling laws, like Chinchilla, is to balance training and inference costs. I think at Meta's scale, you would rather pay a lot more maybe at training and then save on inference. How do you think about that from an infrastructure perspective? I think in your tweet, you say you can try and guess how we're using these GPUs. Can you just give people a bit of understanding? Because I've already seen a lot of VCs say, Llama 3 has been trained on 600,000 GPUs, and that's obviously not true, I'm sure. How do you allocate between the research, FAIR and the Llama training, the inference on Instagram suggestions that get me to scroll, like AI-generated stickers on WhatsApp, and all of that?Soumith [00:41:11]: Yeah, we haven't talked about any of this publicly, but as a broad stroke, it's like how we would allocate resources of any other kind at any company. You run a VC portfolio, how do you allocate your investments between different companies or whatever? You kind of make various trade-offs and you kind of decide, should I invest in this project or this other project, or how much should I invest in this project?
It's very much a zero-sum set of trade-offs. And it also comes into play how your clusters are configured, like overall, what you can fit of what size in what cluster and so on. So broadly, there's no magic sauce here. I mean, I think the details would add more spice, but also wouldn't add more understanding. It's just gonna be like, oh, okay, they just think about this as I would normally do.Alessio [00:42:05]: So even the GPU-rich run through the same struggles of having to decide where to allocate things.Soumith [00:42:11]: Yeah, I mean, at some point, I forget who said it, but you kind of fit your models to the amount of compute you have. If you don't have enough compute, you figure out how to make do with smaller models. But no one as of today, I think, would feel like they have enough compute. I don't think I've heard any company within the AI space be like, oh yeah, we feel like we have sufficient compute and we couldn't have done better. So that conversation, I don't think I've heard from any of my friends at other companies.EleutherSwyx [00:42:47]: Stella from Eleuther sometimes says that because she has a lot of donated compute. She's trying to put it to interesting uses, but for some reason she's decided to stop making large models.Soumith [00:42:57]: I mean, that's a cool, high-conviction opinion that might pay out.Swyx [00:43:01]: Why?Soumith [00:43:02]: I mean, she's taking a path that most people don't care to take in this climate, and she probably will have very differentiated ideas. I mean, think about the correlation of ideas in AI right now. It's so bad, right? Everyone's fighting for the same pie. In some weird sense, that's partly why I don't directly work on LLMs. I used to do image models and stuff, and I actually stopped doing GANs because GANs were getting so hot that I didn't have any calibration of whether my work would be useful or not, because, oh yeah, someone else did the same thing you did.
It's like, there's so much to do, I don't understand why I need to fight for the same pie. So I think Stella's decision is very smart.Making BetsAlessio [00:43:53]: And how do you reconcile that with how we started the discussion about intrinsic versus extrinsic kind of like accomplishment or success? How should people think about that especially when they're doing a PhD or early in their career? I think at NeurIPS, I walked through a lot of the posters and whatnot, there seems to be mode collapse in a way in the research, a lot of people working on the same things. Is it worth for a PhD to not take a bet on something that is maybe not as interesting just because of funding and visibility and whatnot? Or yeah, what suggestions would you give?Soumith [00:44:28]: I think there's a baseline level of compatibility you need to have with the field. Basically, you need to figure out if you will get paid enough to eat, right? Like whatever reasonable normal lifestyle you want to have as a baseline. So you at least have to pick a problem within the neighborhood of fundable. Like you wouldn't wanna be doing something so obscure that people are like, I don't know, like you can work on it.Swyx [00:44:59]: Would a limit on fundability be, I'm just observing, something like three months of compute, right? That's the top line, that's the like max that you can spend on any one project.Soumith [00:45:09]: But like, I think that's very ill-specified, like how much compute, right? I think that the notion of fundability is broader. It's more like, hey, is this family of models within the acceptable set of, you're not crazy or something, right? Even something like neural ODEs, which is a very boundary-pushing thing, or state-space models or whatever. Like all of these things I think are still in fundable territory. When you're talking about, I'm gonna do one of the neuromorphic models and then apply image classification to them or something, then it becomes a bit questionable.
Again, it depends on your motivation. Maybe if you're a neuroscientist, it actually is feasible. But if you're an AI engineer, like the audience of these podcasts, then it's more questionable. The way I think about it is, you need to figure out how you can be in the baseline level of fundability just so that you can just live. And then after that, really focus on intrinsic motivation and, depending on your strengths, like how you can play to your strengths and your interests at the same time. Like I try to look at a bunch of ideas that are interesting to me, but also try to play to my strengths. I'm not gonna go work on theoretical ML. I'm interested in it, but when I want to work on something like that, I try to partner with someone who is actually a good theoretical ML person and see if I actually have any value to provide. And if they think I do, then I come in. So I think you'd want to find that intersection of ideas you like, and that also play to your strengths. And I'd go from there. Everything else, like actually finding extrinsic success and all of that, the way I think about it is, it's somewhat immaterial. When you're talking about building ecosystems and stuff, slightly different considerations come into play, but that's a different conversation.Swyx [00:47:06]: We're gonna pivot a little bit to just talking about open source AI. But one more thing I wanted to establish for Meta is this 600K number, just kind of rounding out the discussion, that's for all Meta. So including your own inference needs, right? It's not just about training.Soumith [00:47:19]: It's gonna be the number in our data centers for all of Meta, yeah.Swyx [00:47:23]: Yeah, so there's a decent amount of workload serving Facebook and Instagram and whatever. And then is there interest in like your own hardware?MTIASoumith [00:47:31]: We already talked about our own hardware. It's called MTIA.
Our own silicon, I think we've even showed the standard photograph of you holding the chip that doesn't work. Like as in the chip that you basically just get like-Swyx [00:47:51]: As a test, right?Soumith [00:47:52]: Yeah, a test chip or whatever. So we are working on our silicon and we'll probably talk more about it when the time is right, but-Swyx [00:48:00]: Like what gaps do you have that the market doesn't offer?Soumith [00:48:04]: Okay, I mean, this is easy to answer. So basically, remember how I told you about there's this memory hierarchy and like sweet spots and all of that? Fundamentally, when you build hardware, you make it general enough that a wide set of customers and a wide set of workloads can use it effectively while trying to get the maximum level of performance they can. The more specialized you make the chip, the more hardware efficient it's going to be, the more power efficient it's gonna be, the easier it's going to be to find the software, like the kernels, right, to just map that one or two workloads to that hardware and so on. So it's pretty well understood across the industry that if you have a sufficiently large volume, enough workload, you can specialize it and get some efficiency gains, like power gains and so on. So the way you can think about everyone building, every large company building silicon, I think a bunch of the other large companies are building their own silicon as well, is they, each large company has a sufficient enough set of verticalized workloads that can be specialized that have a pattern to them that say a more generic accelerator like an NVIDIA or an AMD GPU does not exploit. So there is some level of power efficiency that you're leaving on the table by not exploiting that. And you have sufficient scale and you have sufficient forecasted stability that those workloads will exist in the same form, that it's worth spending the time to build out a chip to exploit that sweet spot.
Like obviously something like this is only useful if you hit a certain scale and that your forecasted prediction of those kind of workloads being in the same kind of specializable exploitable way is true. So yeah, that's why we're building our own chips.Swyx [00:50:08]: Awesome.Open Source AIAlessio [00:50:09]: Yeah, I know we've been talking a lot on a lot of different topics and going back to open source, you had a very good tweet. You said that a single company's closed source effort rate limits against people's imaginations and needs. How do you think about all the impact that some of the Meta AI work in open source has been doing and maybe directions of the whole open source AI space?Soumith [00:50:32]: Yeah, in general, I think first, I think it's worth talking about this in terms of open and not just open source, because like with the whole notion of model weights, no one even knows what source means for these things. But just for the discussion, when I say open source, you can assume it's just I'm talking about open. And then there's the whole notion of licensing and all that, commercial, non-commercial, commercial with clauses and all that. I think at a fundamental level, the most beneficial part of open source is that you make the distribution very wide. It's just available with no friction and people can do transformative things in a way that's very accessible. Maybe it's open source, but it has a commercial license and I'm a student in India. I don't care about the license. I just don't even understand the license. But like the fact that I can use it and do something with it is very transformative to me. Like I got this thing in a very accessible way. And then it's various degrees, right? And then if it's open source, but it's actually a commercial license, then a lot of companies are gonna benefit from gaining value that they didn't previously have, that they maybe had to pay a closed source company for.
So open source is just a very interesting tool that you can use in various ways. So there's, again, two kinds of open source. One is some large company doing a lot of work and then open sourcing it. And that kind of effort is not really feasible by say a band of volunteers doing it the same way. So there's both a capital and operational expenditure that the large company just decided to ignore and give it away to the world for some benefits of some kind. They're not as tangible as direct revenue. So in that part, Meta has been doing incredibly good things. They fund a huge amount of the PyTorch development. They've open sourced Llama and that family of models and several other fairly transformative projects. FAISS is one, Segment Anything, Detectron, Detectron 2. Dense Pose. I mean, it's-Swyx [00:52:52]: Seamless. Yeah, seamless.Soumith [00:52:53]: Like it's just the list is so long that we're not gonna cover it. So I think Meta comes into that category where we spend a lot of CapEx and OpEx and we have a high talent density of great AI people and we open source our stuff. And the thesis for that, I remember when FAIR was started, the common thing was like, wait, why would Meta wanna start an open AI lab? Like what exactly is the benefit from a commercial perspective? And then the thesis was very simple. It was AI is currently rate limiting Meta's ability to do things. Our ability to build various product integrations, moderation, various other factors. Like AI was the limiting factor and we just wanted AI to advance more and we didn't care if the IP of the AI was uniquely in our possession or not. However the field advances, that accelerates Meta's ability to build a better product. So we just built an open AI lab and we said, if this helps accelerate the progress of AI, that's strictly great for us. But very easy, rational, right? Still the same to a large extent with the Llama stuff. And it's the same values, but the argument, it's a bit more nuanced.
And then there's a second kind of open source, which is, oh, we built this project, nights and weekends and we're very smart people and we open sourced it and then we built a community around it. This is the Linux kernel and various software projects like that. So I think about open source, like both of these things being beneficial and both of these things being different. They're different and beneficial in their own ways. The second one is really useful when there's an active arbitrage to be done. If someone's not really looking at a particular space because it's not commercially viable or whatever, like a band of volunteers can just coordinate online and do something and then make that happen. And that's great.Open Source LLMsI wanna cover a little bit about open source LLMs maybe. So open source LLMs have been very interesting because I think we were trending towards an increase in open source in AI from 2010 all the way to 2017 or something. Like where more and more pressure within the community was to open source their stuff so that their methods and stuff get adopted. And then the LLMs revolution kind of took the opposite turn: OpenAI stopped open sourcing their stuff, and DeepMind kind of didn't, like all the other cloud and all these other providers, they didn't open source their stuff. And it was not good in the sense that, first, science done in isolation probably will just form its own bubble where people believe their own b******t or whatever. So there's that problem. And then there was the other problem which was the accessibility part. Like, okay, I again always go back to I'm a student in India with no money. What is my accessibility to any of these closed models? At some scale I have to pay money. That makes it a non-starter and stuff. And there's also the control thing. I strongly believe if you want human aligned stuff, you want all humans to give feedback. And you want all humans to have access to that technology in the first place.
And I actually have seen, living in New York, whenever I come to Silicon Valley, I see a different cultural bubble. Like all the friends I hang out with talk about some random thing like Dyson Spheres or whatever, that's a thing. And most of the world doesn't know or care about any of this stuff. It's definitely a bubble and bubbles can form very easily. And when you make a lot of decisions because you're in a bubble, they're probably not globally optimal decisions. So I think open source, the distribution of open source powers a certain kind of non-falsifiability that I think is very important. I think on the open source models, like it's going great in the fact that LoRA I think came out of the necessity of open source models needing to be fine-tunable in some way. Yeah, and I think DPO also came out of the academic open source side of things. So do any of the closed source labs, did any of them already have LoRA or DPO internally? Maybe, but that does not advance humanity in any way. It advances some companies probability of doing the winner takes all that I talked about earlier in the podcast.Open Source and TrustI don't know, it just feels fundamentally good. Like when people try to, you know, people are like, well, what are the ways in which it is not okay? I find most of these arguments, and this might be a little controversial, but I find a lot of arguments based on whether closed source models are safer or open source models are safer very much related to what kind of culture they grew up in, what kind of society they grew up in. If they grew up in a society that they trusted, then I think they take the closed source argument. And if they grew up in a society that they couldn't trust, where the norm was that you didn't trust your government, obviously it's corrupt or whatever, then I think the open source argument is what they take.
I think there's a deep connection to like people's innate biases from their childhood and their trust in society and governmental aspects that push them towards one opinion or the other. And I'm definitely in the camp that open source is definitely going to actually have better outcomes for society. Closed source to me just means centralization of power, which, you know, is really hard to trust. So I think it's going well.
Kyle Mathews is Co-Founder & CTO of Gatsby, the front-end web development platform. Their open source framework, GatsbyJS, is widely adopted with 55K GitHub Stars. In Feb 2023, Gatsby was acquired by Netlify. In this episode, we discuss how GatsbyJS was able to grow incredibly fast, what features matter most for front-end development frameworks (speed, approachability, etc.), learnings from going after a smaller portion of the market and over-hiring & more!
Uncomfortable Conversations Podcast The Untold Stories of the 3HO Kundalini Yoga Community
Guruganesha Singh grew up in Natick, MA, 18 miles west of Boston, and graduated with the high school class of 1968. In Feb 1972, he attended his first KY class at the boathouse at Smith College in Northampton, MA, after seeing YB's tratakum pic on a flyer in the Student Union. Soon after he attended a weekend KY retreat at Montague, MA with Guru Shabd Singh. Very soon thereafter he moved into the Worcester, MA ashram near Clark University, where he was in his senior year of college. The first week of Jan. 1973 he moved into Ahimsa Ashram on Q St. in Washington, DC, where he became the dishwasher at the Golden Temple Restaurant for many months. His marriage to Gurudarshan Kaur, originally from Tucson, AZ, was arranged in May 1977. They had their son, Akal Sahai Singh, in April 1978. In 1986, at age 8, Akal Sahai went to India for one year, which caused a rupture in their marriage, and Gurudarshan started divorce proceedings. In 1987 it was suggested by the head of the ashram that he marry Mata Mandir Kaur from the DC ashram (and they are still married 36 years later). He worked for all the DC ashram businesses through 1989, when he resigned from the "family" businesses and started his own business, Sandler Sales Institute, a sales training business serving the high-tech sector. In Jan 2000 he founded Spirit Voyage Music and started recording and touring with Snatam Kaur for the next 11 years. In 2011 he and Snatam parted ways, and he started the GuruGanesha Band. The revelations of 2020 exposed what he held at a distance for far too long, and the undeniable truth created a landslide. Most recently, after the results of the 2023 Siri Singh Sahib Corporation (SSSC) elections, he immediately resigned from Khalsa Council and the Sikh Dharma Ministry.
Song credit: Landslide by Guruganesha Listen to the Uncomfortable Conversations Spotify Playlist: ________________________ To be a guest on the podcast, please email GN@GuruNischan.com Follow my work at www.GuruNischan.com Contribute to this work at www.paypal.me/gurunischanllc Book website link to Under the Yoga Mat: The Dark History of Yogi Bhajan's Kundalini Yoga - https://www.undertheyogamat.com/
In Feb 2023, the USPSTF recommended that clinicians “screen for hypertensive disorders of pregnancy”. Specifically, they stated that “measuring blood pressure at each prenatal visit is the best approach”. Mind blowing I know.
GETTIN' SALTY EXPERIENCE PODCAST Ep.147 - Our special guest will be 22 year FDNY veteran Jules Ellison. He went to proby school in 1989 and was assigned to Engine 224. In Feb 1996 he transferred to Engine 236 (Kubes
Deep Dive: the ongoing mRNA Nuremberg Files and Rochelle Walensky's giggling mouth full of lies.We continue to cover the Covid PsyOp because it's not about Covid, it's about the war of our age, the battle to maintain our status as people made in God's image against the technocratic medical state that wants us remade in their own image. Arrogance and inhumanity: In August of 2020, Pfizer apparently knew about 1.6 million adverse events covering nearly every organ system. Arrogance and ignorance: The CDC's least believable liar, “Doctor” Rochelle Walensky giggles as she pretends the VAERS database is where people report their loved ones dying in car accidents. Arrogance and mockery: German news reports a 17-year-old girl was paralyzed by the Pfizer COVID vaccine. The state gov't confirms “vaccine” caused it and pays her 854 euros/month. However, the family is denied compensation from the manufacturer due to its liability immunity. Arrogance and brutality: Turbo gastric cancer - diagnosis to death in 12 days, tragic story of a 49 yo army nurse Billie-Joe Graham - many cases are being reported after Pfizer & Moderna COVID-19 mRNA vaccination! Arrogance and totality: the federal HHS intends to use your money to force your state to mutilate your kids. But, praise God, there is a change coming. A slow, steady change. A Canadian Provincial Premier apologizes to the “unvaccinated”, “we were wrong…” She makes an unprecedented promise…What does God's Word say? Ephesians 2:10 For we are his workmanship, created in Christ Jesus for good works, which God prepared beforehand, that we should walk in them.Mark 9:42 Causing to Stumble42 “If anyone causes one of these little ones—those who believe in me—to stumble, it would be better for them if a large millstone were hung around their neck and they were thrown into the sea.Episode 903 Links:German news reports a 17-year-old girl was paralyzed by the Pfizer COVID vaccine. 
The state gov't confirms “vaccine” caused it and pays her 854 euros/month. However, the family is denied compensation from the manufacturer due to its liability immunity.Horowitz: Confidential Pfizer document shows the company observed 1.6 million adverse events covering nearly every organ systemTurbo gastric cancer - diagnosis to death in 12 days, tragic story of a 49 yo army nurse Billie-Joe Graham - many cases are being reported after Pfizer & Moderna COVID-19 mRNA vaccination!In Feb. 2020, after a @nypost oped said COVID-19 came from a lab leak, Facebook censored the story. Why? Because "independent fact-checkers" said it was "False information." Not only was it true information, one of the fact-checkers had worked at the labTop Canadian politician apologizes to “unvaccinated”, “we were wrong…” She makes an unprecedented promise…MTG Asks CDC Director Rochelle Walensky Which Vaccine Company She is Going to Work For After She Leaves the CDC - "Now that you're going to be leaving the CDC pretty soon, what job are you going to take? Are you going to be on the board of either Pfizer or Moderna because you've done one hell of a job in making sure that they've made a lot of money."Miami Mayor Francis Suarez says Florida Gov. Ron DeSantis' lifting of Covid-19 business restrictions and suspending fines will “have a huge impact.” “This is a very dangerous time … He's taken some of these decisions out of our ability to regulate them.” Nadler: "When we have a pandemic like covid, two-year-olds should have been required to wear masks. It would be child abuse for parents not to do that."Rep. 
@chiproytx responded by ripping Nadler a new one: "My Democratic colleagues [believe] the full power of the federal government should be part of ensuring and forcing your two year old child to be masked."Congressman @RepJimBanks: "So you would withhold hospital grants from states like mine that ban transgender sex reassignment surgeries for minors?” HHS Secretary @SecBecerra: “We will-"4Patriots https://4patriots.com Protect your family with Food kits, solar generators and more at 4Patriots. Use code TODD for 10% off your first purchase. Alan's Soaps https://alanssoaps.com/TODD Use coupon code ‘TODD' to save an additional 10% off the bundle price. BiOptimizers https://magbreakthrough.com/todd Use promo code TODD for 10% off your order. Bonefrog https://bonefrog.us Enter promo code TODD at checkout to receive 10% off your subscription. Bulwark Capital http://KnowYourRiskRadio.com Find out how Bulwark Capital Actively Manages risk. Call 866-779-RISK or visit KnowYourRiskRadio.com Healthycell http://healthycell.com/todd Protect your heart with Healthycell! Use promo code TODD for 20% off your first order. My Pillow https://mypillow.com Use code TODD for BOGO on the new MyPillow 2.0 Patriot Mobile https://patriotmobile.com/herman Get free activation today with offer code HERMAN. Visit or call 878-PATRIOT. RuffGreens https://ruffgreens.com/todd Get your FREE Jumpstart Trial Bag of Ruff Greens, simply cover shipping. Visit or call 877-MYDOG-64. SOTA Weight Loss https://sotaweightloss.com SOTA Weight Loss is, say it with me now, STATE OF THE ART! GreenHaven Interactive https://greenhaveninteractive.com Digital Marketing including search engine optimization and website design.This show is part of the Spreaker Prime Network, if you are interested in advertising on this podcast, contact us at https://www.spreaker.com/show/5674544/advertisement
Ron Bennington joins me on today's episode to talk about interview style and his series Unmasked. Ron is someone whose style of interviewing I've always admired, so it seemed fitting to have him on this episode. All upcoming shows & tickets: https://www.arishaffir.com/tour Instagram: https://www.instagram.com/arishaffir/ • Oct 21: Bensalem, PA • Nov 01: Omaha, NE • Nov 02: Minneapolis, MN • Nov 03: Madison, WI • Nov 04: Chicago, IL • Nov 05: Iowa City, IA • Nov 07: Tulsa, OK • Nov 09: Kansas City, MO • Nov 10: Louisville, KY • Nov 11: Indianapolis, IN • Nov 12: St. Louis, MO • Nov 13: Ft. Wayne, IN • Feb 02: Boston, MA • Feb 03: Foxwoods, CT For more Ron Bennington: https://benningtonshow.com/#Links The Music: Buggles - Video Killed The Radio Star Talk Talk - Talk Talk The Doors - The End Simple Minds - Don't You Forget About Me
Sometimes you find yourself in the tropical paradise of Costa Rica, training jiu jitsu and spending time with close friends. These are some of my favorite trips, and I always enjoy sitting down and picking the brains of experts in their fields. As usual, we covered much more than just jiu jitsu, as these conversations often span life in general, and business. We definitely spent some time unpacking and discussing the recent multi-million dollar injury award to a white belt jiu jitsu student, and the potential impact on jiu jitsu academies worldwide. Henry Akins began training in Brazilian Jiu-Jitsu in 1995 at the Rickson Gracie Academy on Pico Blvd in West LA. Shortly after he started he became the secretary at the academy and was spending 70 hours a week there watching and participating in all of the classes. In 1997 the school moved to the Pacific Palisades, where Henry trained and assisted main instructor Luis Heredia and also participated and trained in all of the classes taught by Rickson. In Feb of 2004, because of his persistence and dedication to the fundamentals and philosophies of Jiu-Jitsu, Rickson Gracie presented Henry with a black belt, being only the third American at the time to receive that honor. Dan Hart is a Henry Akins black belt and owner of Alpha BJJ in Woodstock, IL. In addition to coaching both at his academy and seminars all over the world he is also an accomplished restaurateur and entrepreneur. The link to Henry's upcoming Sedona Seminar https://www.hiddenjiujitsucamps.com/cottonwood-sedona-az The link to Dan and Leah's upcoming seminar in Ireland: https://www.wetravel.com/trips/pilgrim-ireland-pilgrim-jiujitsu-ballycastle-87008723
“Nope! The injunction stays and you're still restrained.”___P was a member of an international group of leasing businesses. D was a former employee of P's.Commencing in 2016, over time D had been promoted, signing a number of new employment agreements: [4]The most recent contract included a 3 month notice period and an extended restraint. D gave evidence they did not read the agreement before signing, instead relying on emails exchanged at the time: [7], [8]In Feb 2023, D resigned purporting to give 4 weeks notice: [9]After D left employment, P was granted an injunction restraining D from competing with P for a period: [2]D entered into an employment agreement with a competing property business purporting to commence in March 2023: [12]D's resignation was a repudiation that P could accept and terminate, or otherwise keep on foot. P kept the contract on foot: [13]The “garden leave” requirements that P said remained on foot were indeed restraints of trade: [13]In early March P got an injunction preventing D from working for their new employer: [15]D sought to discharge the injunction: [16]P had to show the restraint was reasonably necessary to protect its legitimate interests, and otherwise compliant with the NSW legislation: [20] - [22]D said they were not bound by the 3 month notice period as they had not read the document (legally immaterial where a person has signed a document known by them to include contractual terms): [25]P said D had developed personal relationships, had access to confidential information like tenders and pricing, and may take 12 months to properly replace: [28]D said head hunting was common among the small industry and the restraint was unnecessary: [30]P established a serious question to be tried due to (i) D's senior status and personal relationships, (ii) that (at least) 3 months might be needed to onboard a replacement for D, and (iii) P had a legitimate interest in protecting the confidential information D was aware of: [31] - 
[33]Noting D's role with P, there was a risk damages were not an appropriate remedy: [35]While D might face some financial risk, P undertook to continue paying their salary, and there was no evidence that D's sign-on bonus was at risk: [36], [37]That protected D's position: [42]D "was the author of (their) own misfortunes" by entering into an arrangement with a new employer in breach of their previous obligations: [40]The balance of convenience favoured the maintenance of the injunction: [43]The application to dismiss the injunction failed. D remained bound by it: [45]
It's time for the Comic Talk Headlines with Generally Nerdy! House of the Dragon release window, Monsterverse confirmations, New trailers, and all the things.Tune in Wednesdays for the regular show and Saturdays for the re-post of the Friday night LIVE SHOW. Plus, don't forget to subscribe for more fresh content. Episodic ShowsFollow-ups/CorrectionsHouse of the Dragon - Casey Bloys, Content CEO at HBO Max, has confirmed that season 2 won't air till summer of 2024. https://variety.com/2023/tv/news/house-of-the-dragon-season-2-summer-2024-return-1235531050/ Dune: Sisterhood - Johan Renck, executive producer of Chernobyl, has stepped down from his role as director of the first two episodes of the project, causing a delay in production while a replacement is sought. Actress Shirley Henderson has also left the project, and her role as Tula Harkonnen will need to be recast https://deadline.com/2023/02/dune-the-sisterhood-director-johan-renck-shirley-henderson-exit-hbo-max-series-1235273486/ MonsterVerse - The Apple TV+ series WILL have crossover with the Legendary movie universe. https://boundingintocomics.com/2023/02/25/godzilla-a-big-part-of-apple-tvs-monsterverse-series-that-is-set-to-introduce-new-titans-and-overlap-with-adam-wingard-helmed-film/ TrailersShadow and Bone - https://youtu.be/0dOmcdz-PN0 Season 2 trailer. Shadow and Bone Season 2 premieres March 16, 2023 only on Netflix.Walking Dead: Dead City - https://youtu.be/84orr4NP9Vc Maybe our first hint as to why Maggie and Negan decide to team up? Has something to do with Hershel.Ted Lasso S03 - https://youtu.be/IR9yjn7Lkdg Ted Lasso Season 3 premieres March 15 on Apple TV+Reg ‘ol NewsRichard Belzer - The Law and Order star passed at 78. Cause of death has not been released.YouTube Anime - Viz Media, who own the rights to Naruto seasons one to eight, the complete Sailor Moon series, Hunter x Hunter seasons one to three, all 37 episodes of Death Note, the complete Inuyasha series, and seasons one and two of Mr. 
Osomatsu, will be making their catalog available for free on YouTube in the US.https://wegotthiscovered.com/anime/you-can-now-watch-death-note-naruto-sailor-moon-and-more-for-free-on-youtube/ SuggestsMando S03E01MoviesFollow-ups/CorrectionsSuper Mario Bros - bumped up to April 5th from the 7th for Easter weekend. Turns the opening into a 5-day affair. https://deadline.com/2023/02/super-mario-bros-movie-release-date-spring-1235274239/ Reg ‘ol NewsJoker 2 - Lady Gaga getting sued! In Feb 2021 Gaga's dogs were kidnapped. One of the 5 people who have been arrested and charged with the kidnapping is now suing Gaga for not paying the $500,000 reward. Breach of Contract and Fraud.https://comicbook.com/movies/news/joker-folie-a-deux-lady-gaga-sued-for-not-paying-ransom-money-to-person-who-kidnapped-her-dogs/ SuggestsBirdman or (The Unexpected Virtue of Ignorance) - 2014 Michael Keaton Zach Galifianakis Edward Norton Andrea Riseborough Amy Ryan Emma Stone Naomi WattsRumor MillConfirmations/RefutationsFortnite - Creed skins were real… called it! https://comicbook.com/gaming/news/fortnite-new-creed-3-skins/ New SourcesThe Flash - George Clooney is that rumored “other” Batman… and he is going to be in the post-credit stinger.New RumorsWonderMan - Yahya Abdul-Mateen II's series set for Disney+ is rumored to be casting Ed Harris in the role of the villain. Which villain that is still remains to be seen.Loki S02 - Kang is rumored to be appearing in NO LESS than 3 episodes of the sequel series.Stones V Beatles - Paul McCartney and Ringo rumored to be making guest appearances on the next Rolling Stones album? 
2021 Grammy producer of the year Andrew Watt, who won for his work with Ozzy Osbourne, will be producing the album for the band.Eternals - The sequel that no one wants is reportedly being added to the Marvel slate…Secret Wars - Series rumored to be debuting in MayDeadpool 3 - Hugh Jackman playing MULTIPLE variants of Wolverine in the movie?You can support this show by visiting our merch store, or by leaving us an Apple Podcasts review.
It's time for the Comic Talk Headlines with Generally Nerdy! House of the Dragon release window, Monsterverse confirmations, new trailers, and all the things. Tune in Wednesdays for the regular show and Saturdays for the re-post of the Friday night LIVE SHOW. Plus, don't forget to subscribe for more fresh content.

Episodic Shows
Follow-ups/Corrections
House of the Dragon - Casey Bloys, Content CEO at HBO Max, has confirmed that season 2 won't air till summer of 2024. https://variety.com/2023/tv/news/house-of-the-dragon-season-2-summer-2024-return-1235531050/
Dune: Sisterhood - Johan Renck, executive producer of Chernobyl, has stepped down from his role as director of the first two episodes of the project, causing a delay in production while a replacement is sought. Actress Shirley Henderson has also left the project, and her role as Tula Harkonnen will need to be recast. https://deadline.com/2023/02/dune-the-sisterhood-director-johan-renck-shirley-henderson-exit-hbo-max-series-1235273486/
MonsterVerse - The Apple TV+ series WILL have crossover with the Legendary movie universe. https://boundingintocomics.com/2023/02/25/godzilla-a-big-part-of-apple-tvs-monsterverse-series-that-is-set-to-introduce-new-titans-and-overlap-with-adam-wingard-helmed-film/

Trailers
Shadow and Bone - https://youtu.be/0dOmcdz-PN0 Season 2 trailer. Shadow and Bone Season 2 premieres March 16, 2023 only on Netflix.
Walking Dead: Dead City - https://youtu.be/84orr4NP9Vc Maybe our first hint as to why Maggie and Negan decide to team up? Has something to do with Hershel.
Ted Lasso S03 - https://youtu.be/IR9yjn7Lkdg Ted Lasso Season 3 premieres March 15 on Apple TV+.

Reg 'ol News
Richard Belzer - The Law and Order star passed at 78. Cause of death has not been released.
YouTube Anime - Viz Media, who own the rights to Naruto seasons one to eight, the complete Sailor Moon series, Hunter x Hunter seasons one to three, all 37 episodes of Death Note, the complete Inuyasha series, and seasons one and two of Mr. Osomatsu, will be making their catalog available for free on YouTube in the US. https://wegotthiscovered.com/anime/you-can-now-watch-death-note-naruto-sailor-moon-and-more-for-free-on-youtube/

Suggests
Mando S03E01

Movies
Follow-ups/Corrections
Super Mario Bros - Bumped up to April 5th from the 7th for Easter weekend. Turns the opening into a 5-day affair. https://deadline.com/2023/02/super-mario-bros-movie-release-date-spring-1235274239/

Reg 'ol News
Joker 2 - Lady Gaga getting sued! In Feb 2021 Gaga's dogs were kidnapped. One of the 5 people who have been arrested and charged with the kidnapping is now suing Gaga for not paying the $500,000 reward. Breach of contract and fraud. https://comicbook.com/movies/news/joker-folie-a-deux-lady-gaga-sued-for-not-paying-ransom-money-to-person-who-kidnapped-her-dogs/

Suggests
Birdman or (The Unexpected Virtue of Ignorance) - 2014. Michael Keaton, Zach Galifianakis, Edward Norton, Andrea Riseborough, Amy Ryan, Emma Stone, Naomi Watts.

Rumor Mill
Confirmations/Refutations
Fortnite - Creed skins were real… called it! https://comicbook.com/gaming/news/fortnite-new-creed-3-skins/

New Sources
The Flash - George Clooney is that rumored "other" Batman… and he is going to be in the post-credit stinger.

New Rumors
WonderMan - Yahya Abdul-Mateen II's series set for Disney+ is rumored to be casting Ed Harris in the role of the villain. Which villain that is still remains to be seen.
Loki S02 - Kang is rumored to be appearing in NO LESS than 3 episodes of the sequel series.
Stones V Beatles - Paul McCartney and Ringo rumored to be making guest appearances on the next Rolling Stones album? 2021 Grammy producer of the year Andrew Watt, who won for his work with Ozzy Osbourne, will be producing the album for the band.
Eternals - The sequel that no one wants is reportedly being added to the Marvel slate…
Secret Wars - Series rumored to be debuting in May.
Deadpool 3 - Hugh Jackman playing MULTIPLE variants of Wolverine in the movie?

You can support this show by visiting our merch store, or by leaving us an Apple Podcasts review.
From losing £500K of client fees in 3 weeks in Q2 2020 to selling Taylor Herring to Publicis in April 2021 - we get the inside track on perhaps the most tumultuous couple of years of any agency in recent history.

On the show this week we talk to James Herring about the story of the last few years of Taylor Herring. In Feb 2020 things were looking rosy, and James and Cath were no doubt planning their summer vacation in the South of France. Then, in 3 weeks, their agency lost £500,000 in monthly fee income. Today we talk to James about the turnaround job which resulted in Publicis buying Taylor Herring in April 2021, and how the agency reported a 40% increase in annual revenue in 2022 (£6.72m, up from £4.9m). Recent client wins include Nintendo, Natwest, Iceland and McVities. The likes of Samsung, Easyjet and Disney are long-term clients. On the show today James and I will also talk about where he sees the future of earned media within integrated communications. Thanks as ever to the PRmoment Podcast sponsors The PRCA.

2.30 mins This is a story that starts at the beginning of the pandemic. What did the pandemic do to Taylor Herring? "For 6 weeks we spent quite a lot of time scratching our heads thinking about whether there was going to be a business at the end of all of this"
4 mins During the worst depths of COVID, how many people were on the team, i.e. those not on furlough?
4.30 mins At the start of the pandemic, Taylor Herring lost £500K in monthly fees in 3 weeks. You'd spent nearly 20 years building Taylor Herring and it seemed to be disintegrating before your eyes?
8.30 mins Why did the PR market come back much quicker than we all anticipated? "We had a resoundingly good summer, probably better than the summer of the year before…it was a boom summer in terms of spend"
9.30 mins Did Taylor Herring approach Publicis or did they approach you?
"I was mowing the lawn at 5:30 pm on a Friday afternoon and Chris (McCafferty) called"
11 mins Why were they interested in a business that had so recently lost so many clients and fee income?
12 mins What was the due diligence process like? "There were 2 bits to it, the informal due diligence process…and the harder end of the legal and financial due diligence - it was a full-time job for 8 weeks." "Cath runs an extremely tight ship when it comes to the organisational side of things" "The process took about 18 months in all"
14.30 mins Since the deal was done Taylor Herring's fee income has increased by 40% - so the earn-out is going well? "Internally we called it the third runway, it was about putting that infrastructure in ahead of the growth" "We've grown from 25 to 55 people over that 2-year period"
16.30 mins Is this a rare example of a PR acquisition that has worked? "We've declined more pitches than we ever have before - because when you add up the money spent on those pitches it adds up to hundreds of thousands (of pounds) in terms of the hours" "The blending of social and PR and content and brand and events means there is a much bigger playground"
19 mins Do Taylor Herring and MSL share many clients?
20 mins How does it feel for James not owning his own business anymore?
21 mins In our pre-show chat, James said prior to the sale to Publicis he'd "basically run Taylor Herring as a lifestyle business for 19 years". James talks us through what he meant by that.
23 mins What is the opportunity for PR-integrated briefs? "Integrated is the single biggest opportunity for a con
Links from the show:
* Vanished in Vermillion: The Real Story of South Dakota's Most Infamous Cold Case
* Vanished in Vermillion website
* Connect with Lou
* Connect with Ryan Ray
* Support the show

About my guest: Lou Raguse is an experienced, award-winning reporter for KARE 11 News in Minneapolis. Growing up in Wheaton, Minn., Lou's love for current events was fostered through a Jeopardy-style game called "Current Events Challenge." Social Studies teacher Russ Armstrong would give extra-credit points to students who read the newspaper and scored highest in the game. Also at Wheaton High School, Lou and his friends wrote the school newspaper "The War Whoop," developing a flair for journalism. That led Lou to the University of Minnesota's journalism program. His senior year, Lou won the national William Randolph Hearst championship in San Francisco, establishing his place early on as a national-level storyteller. Lou spent three years at KELO in Sioux Falls, S.D., reporting on stories such as the state's first execution in 60 years, a year after it was dramatically halted at the 11th hour. As the cops and courts beat reporter, he covered trials such as Daphne Wright's killing and dismembering of a fellow member of the deaf community. Pay attention and you might catch Lou talking about the case on various national cable crime shows. From Sioux Falls, Lou moved to the Sonoran Desert in Tucson, Ariz., along with his wife Emily, also a reporter. In Tucson, Lou helped launch FOX 11 News at Nine, anchoring the newscast for four years.
Along the way, he covered the attempted assassination of Congresswoman Gabrielle Giffords, along with many hot-button border issues.After moving to Buffalo, reporting and anchoring weekends for WIVB, Lou continued to compile awards for his work — through New York State Associated Press, NY Broadcasters Association, and NY Emmys.But his finest awards came in 2013, when Lou and Emily welcomed home their little Buffalo baby Violet, and in 2016 when her little brother Westley was born.In 2015 the family moved to Minneapolis where Lou reports at KARE 11, home of the finest storytelling journalism in the country.At KARE 11, Lou has been at the forefront of some of the nation's biggest stories, leading the coverage of the death of George Floyd and the trials for the officers charged with killing him.One particular story that resonated was the kidnapping of 13-year-old Jayme Closs and her subsequent escape after 88 days in captivity. After covering the criminal case that followed, Lou produced an eight-episode podcast, “88 Days: The Jayme Closs Story,” which peaked in the top 10 on the iTunes Charts.In Feb. 2023, Lou's first true crime book was published by Post Hill Press. “Vanished in Vermillion” flips the script on the genre and reveals all the ways the 40-year search for two missing teenage girls went horribly wrong.While free time is harder to come by with a little one in the house, Lou still enjoys playing Tecmo Super Bowl on NES, competing in fantasy football leagues (including one since 1999), watching NFL football, and archiving home movies and photos from the good old days. Get full access to Dispatches from the War Room at dispatchesfromthewarroom.substack.com/subscribe
This interview is a part of our "Love is..." Series. In this series we are exploring how God defines love and how that love plays out in our everyday lives. This episode is with Krys Grant-Ray, a single mother, a community leader and a Kingdom-builder. She is a certified life coach, trained chaplain, and works full time as a relationship specialist at the Hollywood Food Coalition. In this powerful and honest conversation Krys shares about her radical view of love developed through trials of childhood trauma, mental health, and a broken marriage. She shares how, through it all, it was as if the Lord was preparing her for where she is now… facing the very real battle of stage 5 metastasized breast cancer. In Feb 2021 she was given 6 months to live with a maximum life expectancy of 2 years. Nevertheless, Krys presses on leaning into her commitment to leave a legacy of love! Her tremendous love for God and for people will inspire even the most downcast of souls. Follow Krys on Instagram @graceforgenerations @krys.grant https://artofsisterhood.org/ https://krysgrantray.com/ Follow Until We Arise on IG @untilwearise https://www.untilwearise.org/ --- Send in a voice message: https://anchor.fm/untilwearise/message
Today's Topics: 1) Free speech is one of the longest-held tenets of American society. Now, government officials and activist groups are not only limiting free speech, but forcing Americans to speak contrary to their beliefs. 2, 3) Missouri AG Eric Schmitt learned disturbing things during a seven-hour deposition with Anthony Fauci. "In Feb '20 (Fauci) emailed a friend advising her masks were ineffective. Confirmed again on Mar 31," wrote Schmitt, who was recently elected to the U.S. Senate. "On Apr 3 he's adamant masks should be worn even though he couldn't cite a single study to prove it. Mandates followed—Lives ruined." https://www-foxnews-com.cdn.ampproject.org/c/s/www.foxnews.com/politics/fauci-emailed-friend-saying-masks-ineffective-pushed-mandates-anyway-missouri-ag.amp 4) Heroes and zeroes: Who stood up for the faith ... and who caved to the culture? Find out who made CatholicVote's November Heroes and Zeroes lists! https://catholicvote.org/heroes_zeroes/heroes-and-zeroes-november-2022/
Dr. Anthony Fauci was deposed by Missouri Attorney General Eric Schmitt last week, who says Fauci told a friend masks were "ineffective," but then turned around and supported mask mandates. FOX News: On Friday, the Show-Me State Republican attorney general tweeted a "tidbit" from his deposition with the outgoing head of the National Institute of Allergy and Infectious Diseases: "Another tidbit from Fauci depo: In Feb '20 he emailed a friend advising her masks were ineffective. Confirmed again on Mar 31. On Apr 3 he's adamant masks should be worn even though he couldn't cite a single study to prove it. Mandates followed—Lives ruined." "COVID tyranny is born," Schmitt continued. From the Daily Caller: New emails published this week shed further light on how Dr. Anthony Fauci and other top government officials responded in the opening days of the COVID-19 pandemic in early 2020. Fauci, the outgoing director of the National Institute of Allergy and Infectious Diseases (NIAID), has downplayed the possibility that COVID-19 originated in a lab setting in Wuhan, China, and has generally denied any wrongdoing or miscalculation in funding dangerous gain-of-function (GoF) research at the Wuhan Institute of Virology (WIV). New emails from early 2020 obtained via the Freedom of Information Act by reporter Jimmy Tobias show Fauci was deeply concerned about the possibility of a lab leak at the time. Get exclusive content here!: https://thepetekalinershow.com/ See omnystudio.com/listener for privacy information.
Welcome to the Rock Your World Naturally Show! We are rocking your world with all things natural. It's the place where healthy women rock! In honor of Veterans Day, Rekishia L. McMillan, aka "The Total Health Coach," interviews Amy Walton, whose late father was a Prisoner of War in WW II and survived the infamous Bataan Death March. The day after Japan bombed the US Naval Base at Pearl Harbor, Hawaii on Dec 7, 1941, the Japanese invasion of the Philippines began. Within a month, the Japanese captured Manila, the capital of the Philippines, and American and Filipino defenders of Luzon were forced to retreat to the Bataan Peninsula. Between 60,000 and 80,000 American and Filipino Prisoners of War were forced to march 65 miles. Many men died during the journey as a result of being severely beaten, receiving little to no food and water, and being stabbed with bayonets. The conditions were disease-ridden and beyond horrific. General Douglas MacArthur, who in 1942 had famously promised to return to the Philippines, made good on his word: in Feb 1945, US and Filipino forces recaptured the Bataan Peninsula and Manila, resulting in liberation. The Japanese commander and two of his officers were tried by the US military, found guilty of war crimes and sentenced to death. Amy shares that her father testified as a witness during the trials. Amy provides an intimate look into her father's life and believes that the fact that he survived was a miracle, as he lived to be 91 years of age. She shares the lessons that she learned in life from her father, including resilience and the power of forgiveness. Not only did Amy's father influence her, but he made a lasting impact on her sons as well, one of whom followed in his grandfather's footsteps and serves as an officer on Active Duty with the US Army.
Amy shares her experience as a proud Military Mother and how her faith provided the hope and strength that she and her family needed during her son's one-year deployment to Iraq. You will be encouraged by this life-changing interview. Discover more about Amy Walton => Website: https://www.holygrounding.com/ Facebook: https://www.facebook.com/amy.walton3/ Instagram: https://www.instagram.com/holygrounding/ LinkedIn: https://www.linkedin.com/in/amywalton317/ Amazon: https://www.amazon.com/Restoration-Renew-Heart-Lenten-Journey-ebook/dp/B07MP9GC7S/ref=sr_1_1?crid=3QJAOUOG6L7OG&keywords=Restoration+Amy+Walton&qid=1661875111&sprefix=restoration+amy+walton%2Caps%2C70&sr=8-1 Rekishia McMillan, also known as "The Total Health Coach," is an Ordained Minister, Certified Social Worker, Certified Integrative Nutrition Health Coach, Award-Winning Author, and honorably retired Air Force veteran. She teaches about health for the body, soul and spirit from a Christian perspective. Her passion is to help women achieve extraordinary health from the inside out by using the Bible, culinary medicine and holistic lifestyle approaches. Follow, Like and Subscribe today @rekishiamcmillan to join a community where "Healthy Women Rock!" Rekishia's prayer is to use this platform as a way to share God's message of health, hope and love to women around the world. Living in 3 John 2. Find Rekishia online at https://www.WomenRockHealthCoaching.com
This episode discusses progress at Insilico Medicine, the AI drug development company founded by our guest, longevity pioneer Alex Zhavoronkov.

1.20 In Feb 2022, Insilico got an IPF drug into phase 1 clinical trials: a first for a wholly AI-developed drug
1.50 Insilico is now well-funded; its software is widely used in the pharma industry
2.30 How drug development works. First you create a hypothesis about what causes a disease
4.00 Pandaomics is Insilico's software to generate hypotheses. It combines 20+ AI models and huge public data repositories
6.00 This first phase is usually done in academia. It usually costs billions of dollars to develop a hypothesis. 95% of them fail
6.50 The second phase is developing a molecule which might treat the disease
7.15 This is the job of Insilico's Chemistry 42 platform
7.30 The classical approach is to test thousands of molecules to see if they bind to the target protein
7.50 AI, by contrast, is able to "imagine" a novel molecule which might bind to it
8.00 You then test 10-15 molecules which have the desired characteristics
8.20 This is done with a variety of genetic algorithms, Generative Adversarial Networks (GANs), and some Transformer networks
8.35 Insilico has a "zoo" of 40 validated models
10.40 Given the ten-fold improvement, why hasn't the whole drug industry adopted this process?
10.50 They all have AI groups and they are trying to change, but they are huge companies, and it takes time
11.50 Is it better to invent new molecules, or re-purpose old drugs, which are already known to be safe in humans?
13.00 You can't gain IP with re-purposed drugs: either somebody else "owns" them, or they are already generic
15.00 The IPF drug was identified during aging research, using aging clocks and a deep neural net trained on longitudinal data
17.10 The third phase is where Insilico's other platform, InClinico, comes into play
17.35 InClinico predicts the results of phase 2 (clinical efficacy) trials
18.15 InClinico is trained on massive data sets about previous trials
19.40 InClinico is actually Insilico's oldest system. Its value has only been ascertained now that some drugs have made it all the way through the pipeline
22.05 A major pharma company asked Insilico to predict the outcome of ten of its trials
22.30 Nine of these ten trials were predicted correctly
23.00 But the company decided that adopting this methodology would be too much of an upheaval; it was unwilling to rely on outsiders so heavily
24.15 Hedge funds and banks have no such qualms
24.25 Insilico is doing pilots for their investments in biotech startups
26.30 Alex is from Latvia originally, studied in Canada, and started his career in the US, but Insilico was established in Hong Kong. Why?
27.00 Chinese CROs, Contract Research Organisations, enable you to do research without having your own wet lab
28.00 Like Apple, Insilico designs in the US and does operations in China. You can also do clinical studies there
28.45 They needed their own people inside those CROs, so had to be co-located
29.10 Hong Kong still has great IP protection, financial expertise, scientific resources, and is a beautiful place to live
29.40 Post-Covid, Insilico also had to set up a site in Shanghai
30.35 It is very frustrating how much opposition has built up against international co-operation
32.00 Anti-globalisation ideas and attitudes are bad for longevity research, and all of biotech
33.20 Insilico has all the data it needs. Its bottleneck is talent
35.00 Another requirement is co-operation from governments and regulators, who often struggle to separate the wheat from the chaff among self-proclaimed AI companies
37.00 Longevity research is the most philanthropic activity in the world
37.30 The Longevity Medicine Course is available to get clinical practitioners up to speed with the sector
I was born to 2 military parents in Mayaguez, Puerto Rico. My siblings and I started school there and soon moved to Hawaii. After a few years on these 2 beautiful islands, our family of 5 decided to settle down in Texas, where my brother and I finished our high school years. Moving so much throughout my childhood had made my family extremely close. For most of our years, it has always been just the 5 of us. No close relatives or family. So, when we lost my oldest sister in 2008, it was overwhelmingly devastating. Soon after my sister's death, I joined the Air Force and moved to Japan. I found love, got married and had a precious baby boy in 2012. In Feb of 2013, I was living in Maryland, on base housing at JBA. I was only 8 months postpartum when I got the knock on my door saying my husband had died. This sent me into a downward spiral. I struggled with addiction for many years after this and even made the decision to give my son over to my parents in case my addiction took my life. Thankfully, in 2017, I was able to quit using when I learned I was pregnant again. I had found love once more and was trying so damn hard to get my family back together. When this finally happened, it was one of the greatest accomplishments of my life. I've learned many new ways to cope with stress, grief, anxiety etc throughout these years I've been in recovery, and in 2020, my dad passed away. I prayed that I wouldn't turn back to my old ways, and I implemented the new techniques I've learned to manage my episodes of stress and depression. I'm hoping to write a book to help others like me that deal with complex grief, mental health and addiction. @nataleeeking instagram and tiktok thewashdownpodcast@gmail.com #NeverAloneAlwaysForward
Watch the live stream: Watch on YouTube

About the show
Sponsored by Microsoft for Startups Founders Hub.

Michael #1: PythonAnywhere: Our Commitment to Providing Free Accounts via Matthew Kramer
In light of Heroku's cancelling their free tiers…
They believe free tiers are important for beginners.
Two-part solution:
- Limit outbound internet access for free accounts
- "Proof of life" to keep running - 3 months for apps, 1 yr for accounts
BTW, they were acquired by Anaconda Inc.

Brian #2: ruff: An extremely fast Python linter, written in Rust.
Announcement article: Python tooling could be much, much faster - Charlie Marsh
Quite the star history, as it's a new repo as of Aug 30. Now at 1.8k.
It is extremely fast. I installed it and tried it on a small project. It ran so fast I thought it didn't do anything. I went and added some errors to convince myself it was running.

$ time flake8 src tests
...
flake8 src tests  0.29s user 0.02s system 98% cpu 0.311 total

$ time ruff src/ tests/
...
ruff src/ tests/  0.01s user 0.01s system 162% cpu 0.011 total

Michael #3: Meta spins off PyTorch Foundation to make AI framework vendor neutral
PyTorch, which powers Tesla Autopilot and 150K other projects, will join the Linux Foundation. Its governing board includes representatives from Nvidia, Meta, Google, Microsoft, Amazon, and AMD. The PyTorch Foundation will strive to adhere to four principles:
- Remaining open
- Maintaining neutral branding
- Staying fair
- Forging a strong technical identity
According to Meta, the transition to the PyTorch Foundation will not affect any existing PyTorch code.

Brian #4: Two string resources
- Python String Methods to Know - Trey Hunner
- F-Strings Number Formatting Cheat Sheet - Brian Allan

Extras
Brian: In Feb, on episode 271, we talked about Seaborn's new object interface. Well, it's out now in seaborn 0.12. Interesting discussion about lazy imports. Other than that, I'm good with your extra.
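As a taste of the kind of thing that f-string number formatting cheat sheet covers, here is a minimal sketch (our own illustrative examples using Python's standard format-spec mini-language, not taken from the episode):

```python
# Common f-string number formatting patterns (standard Python format specs)
value = 1234.56789

print(f"{value:.2f}")      # fixed 2 decimal places -> 1234.57
print(f"{value:,.2f}")     # thousands separator -> 1,234.57
print(f"{value:>12.2f}")   # right-aligned in a 12-character field
print(f"{value:.3e}")      # scientific notation -> 1.235e+03
print(f"{0.25:.1%}")       # percentage -> 25.0%
```

The same format specs also work with `str.format()` and the built-in `format()` function.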
Barby Ingle is a best-selling author, reality personality, and lives with multiple rare and chronic diseases: reflex sympathetic dystrophy (RSD), migralepsy, PALB2-var breast cancer, valley fever, endometriosis, and other pain disorders. Barby is a chronic pain educator, patient advocate, and president of the International Pain Foundation. She is also a motivational speaker and best-selling author on pain topics. Her blog, reality shows, and media appearances are used as a platform to help her become an e-Patient advocate, and she presents at healthcare conferences, speaking publicly, sharing her story, and educating and advocating for patients across the globe. She has received more than 25 commendations for her advocacy efforts, including: 2012 WEGO Health Ms. Congeniality, 2012 NAF You Are Our Hero Award, 2013 International Inspirational Luminary, 2015 IDA Impact Award, and 2016 WEGO Health Lifetime Achievement. In 2017, Barby was named a Health Information Technology Top 100 Influencer by HealthScene and Top 20 Health Influencer by Insider Monkey Magazine. In 2018, Barby received the Reality All-Star Reunion Superstar award for her Social Media efforts and Top 50 Chronic Pain Advocates. In 2020, Barby was listed in the top 50 social media advocates for Rare Diseases and top 10 Healthcare Influencers for All Marketers to Follow, 2020 PharmaVOICE100, 2020 HITMC Patient Advocate of the Year. In 2021 Barby was awarded the 2021 Medigy HITmc Music Video of the year and 2021 Arizona Capitol Times Leader of the Year; Healthcare. In Feb. 2021 Barby was listed in the top 75 social media advocates for rare diseases. Barby currently serves as the reigning Mrs. Southwest Petite USA and will compete at the National event in Aug 2022. http://www.barbyingle.com/
Andrea Johnson is a certified John Maxwell Leadership Coach and a certified DISC Behavioral Analysis Consultant. Andrea helps high-performing, mission-minded women grow, lead and succeed, using the six tenets of Intentional Optimism. She operates a collaborative group coaching program called "Launch from The Beach." In Feb. 2017, Andrea's mother finished her 15-year fight with breast cancer, and the reflective nature of grief allowed Andrea to purposefully walk through the pain and anxiety and emerge, as if from a crucible, holding the gem of Intentional Optimism - her personal growth plan - a comprehensive approach to Personal Growth and Leadership Training viewed through the lens of hope. Andrea and I discuss Intentional Optimism and its six tenets. We talk about how women are uniquely qualified to be leaders and how Andrea defines Unconventional Leadership. This is a great conversation for both men and women to listen to. To learn more about Andrea, please visit: http://www.theintentionaloptimist.com/
Listeners, we're back this week with Ashley K. Stoyanov Ojeda. Ashley is an author, community-builder, business development strategist, coach, and socialpreneur. Originally from Queens, NYC and born to a Mexican mom and French-American father, Ashley's career started in the music industry in 2012, working at major record labels, publishers, and venues. After relocating to Portland, OR post-college, she created her own network for local womxn songwriters, now a national organization that has been featured in The Recording Academy, called #WomxnCrush Music. Since the rapid growth of her organization, she has dedicated her career to creating opportunities and developing businesses and communities of underrepresented entrepreneurs through her coaching and consulting, and has become known as the Business Hada Madrina (Business Fairygodmother). Ashley joined The Mujerista team in 2020 to help create and grow The Mujerista Network, a digital network dedicated to empowering and celebrating the next generation of Latinas making an impact en la cultura. In Feb 2022, she'll be releasing her debut book through Mango Publishing called Jefa in Training. Ashley currently resides in Portland, Oregon.

During this episode we talked about:
05:40 - Heritage and being a half this half that
08:43 - "La gringuita"
21:25 - Privilege
30:10 - Getting into music
32:51 - "You are a business"
35:58 - Coaching and writing
40:45 - Rejection

This episode is brought to you by Cox.com

Follow Ashley on all things social: Instagram

Follow Cafe con Pam on all things social:
Instagram
Facebook
http://cafeconpam.com/
Join the FREE Cafe con Pam Challenge
Join our Discord space and let's keep the conversation going!
If you are a business owner, join us for Aligned Collective Mastermind
Learn about PowerSisters
Subscribe, rate, review, and share this episode with someone you love!
And don't ever forget to Stay Shining!
We hear about people living with pain daily. When pain hits, we focus on the pain and not the cost of living in pain. She lost everything to pain, then slowly rebuilt her life and became an advocate for pain patients: Barby Ingle, the president of the International Pain Foundation. She's here to share her story and provide insight on the health and financial impact of pain. Barby Ingle is a best-selling author, reality personality, and lives with multiple rare and chronic diseases: reflex sympathetic dystrophy (RSD), migralepsy, PALB2-var breast cancer, valley fever, endometriosis and other pain disorders. She is also a motivational speaker and best-selling author on pain topics. Her blog, reality shows and media appearances are used as a platform to help her become an e-Patient advocate, and she presents at healthcare conferences, speaking publicly, sharing her story, educating and advocating for patients across the globe. She has received more than 20 commendations over the years for her advocacy work, including: 2012 WEGO Health Ms. Congeniality, 2012 NAF You Are Our Hero Award, 2013 International Inspirational Luminary, 2015 IDA Impact Award, and 2016 WEGO Health Lifetime Achievement. In 2017, Barby was named a Health Information Technology Top 100 Influencer by HealthScene and Top 20 Health Influencer by Insider Monkey Magazine. In 2018, Barby received the Reality All Star Reunion Superstar award for her Social Media efforts and Top 50 Chronic Pain Advocates. In Feb. 2021 Barby was listed in the top 75 social media advocates for Rare Diseases. In 2020, Barby was listed in the top 50 social media advocates for Rare Diseases and top 10 Healthcare Influencers for All Marketers to Follow, 2020 PharmaVOICE100, 2020 HITMC Patient Advocate of the Year. In 2021 Barby was awarded the 2021 Medigy HITmc Music Video of the year and 2021 Arizona Capitol Times Leader of the Year; Healthcare.
For more great information or to contact Barby, please visit: http://barbyingle.com/

Valuable Links:
- Pharmacogenomics (PGX) - www.mygeneticmeds.com
- Viome - usually $300-500. You can get the Viome gut health test at http://viome.com/raceto2million on sale for $149.

For more information and other valuable resources, make sure to subscribe, follow and visit our sites.
Website: https://www.thevoiceofmany.com
Instagram: https://www.instagram.com/theevoiceofmany/?hl=en
Twitter: https://twitter.com/TheVoiceofMany3
Facebook: https://www.facebook.com/The-Voice-of-Many
LinkedIn: www.linkedin.com/in/the-voice-of-many-podcast-1417a81b7
YouTube: https://www.youtube.com/channel/UCMmouE4IqrsPG2gnaERlY-A
Support the show
Today we have Iris Nevins, the co-founder and CEO of Umba Daima, a creative NFT studio which also houses Black NFT Art and Unseen Gallery. Before becoming a tech and art entrepreneur, she was a teacher and community activist who eventually became a software engineer, working at companies like Mailchimp and Vox Media. In Feb 2021 she left to start Umba Daima, focused on building a multi-cultural ecosystem of educational & social web3 experiences.

Timecodes:
0:00 - Intro
1:07 - Web3 Clubhouse beginnings
3:51 - Iris's background
11:55 - Umba Daima's genesis
12:25 - Experience of managing artists
14:29 - What's the scope of building spaces in web3
16:19 - Umba Daima as an NFT studio
16:37 - Iris's upcoming projects
21:09 - Will Umba Daima have its own token
23:34 - Advice entering web3 from a nontraditional tech background
26:24 - How to work in web3
28:41 - How IRL events can change the way we onboard people into web3
32:42 - The evolution of Umba Daima
35:47 - Fundraising $2M
42:31 - Where to find out more about Iris
44:27 - What's your favorite NFT
John joins the show, tonight, calling in from the Pocono Mountains in Pennsylvania! (**apologies for the audio dropping out a few times during the interview - the mountain wifi gets a bit spotty haha)

John is happily married and the proud father of 2 sons (ages 5 and 1). He is a pro trumpet player and a full-time SAH dad. A big part of John's trauma and patterns comes from the fact his mother passed away when he was 8 years old. As a result, John spent a great deal of his life keeping emotions buried inside him - which is where alcohol comes in... John's story with alcohol started off harmless enough - every other day, only after a "good day", etc. Darkness started to creep in as John found his life out of sync with his own vision for himself and his life. He was making decisions but was having a hard time accepting them.

Topics include:
- Irish/German bloodline - father was a binge drinker in John's early years
- rough childhood due to his mother passing
- using music as an outlet
- picking up the trumpet in middle school
- finding that he was imitating the 'musician lifestyle' of drinking

"Life isn't about finding yourself. Life is about creating yourself." - George Bernard Shaw

- looking more forward to getting smashed than performing
- more about "the hang" than the music
- a long-distance relationship with his wife-to-be
- taking a dream music gig that was 4+ hrs away from home
- having a hard time accepting the decisions that were made
- composing & recording a concept album for his son
- denial...

Recipe for recovery:
- In Feb 2017, John and his wife's 1st son came into the world.
- John heard a quote on a podcast about why one of his heroes stopped drinking after his son was born: "I didn't want to be a loser father".
- Physical activity helped John get the buried emotions out of his system, to confront and address them (BJJ, swimming, running). John's sober date is Oct 23rd, 2019. His life has changed 180 degrees for the positive! This is truly an amazing story of recovery and redemption.
Advances in gene and cell therapies are enabling researchers, clinicians, families, and regulators to work together in incredible new ways to treat previously untreatable conditions. Listen to Christina Mayer share her efforts to advance policies that help to realize a future where gene and cell therapies are available to all individuals for all diseases. Christina is currently the Senior Manager of Government Affairs at the American Society of Gene and Cell Therapy in Milwaukee, Wisconsin. She works with federal government agencies and decision-makers to impact key components of gene and cell therapies like NIH research funding, genetic testing and screening, payment policy, and patient access to approved therapies. She also contributes to the Society's work on other policy priorities, such as regulatory oversight and the responsible use of new genetic technologies. Christina has a Master of Public Administration from the University of Nebraska-Omaha. Listen with us as we imagine a future where the availability and equitable use of gene and cell therapies helps to realize the promise of a healthy future for all. Podcast Interview Questions with Christina Mayer: 1. You are currently the Senior Manager of Government Affairs at the American Society of Gene and Cell Therapy. Can you tell us about the mission of the American Society of Gene and Cell Therapy? What is your role there? 2. What an exciting and rewarding time to advocate for gene and cell therapies! ASGCT reminds us all that there are literally 1000s of clinical trials for novel therapies, and that over the next decade there will be about 30 approved therapies for genetic disease, not counting cancer. What can our listeners do to become more aware of these efforts and to become advocates for continued advancement and access to these life-saving and, in many cases, disease-curing therapies? 3. 
In its 2020 – 2022 strategic plan, ASGCT identified access to genetic testing and screening as one of its core patient access priorities. What efforts have been made by ASGCT to advance access to genetic testing and screening in newborns? Can you describe ways that the current approach to newborn screening in the United States could be improved to enable the use of gene and cell therapies? What are ways that our listeners can get connected to your organization? 4. In Feb 2021, ASGCT provided a public comment to the Advisory Committee on Heritable Disorders in Newborns and Children (ACHDNC) on the newborn screening process. In this letter, ASGCT stated its support of the Newborn Screening Saves Lives Act. Can you tell our listeners more about the history of this act? Why is it important that it get passed? What would happen if it doesn't? 5. Also in the letter, ASGCT offers three recommendations to the ACHDNC. They are: 1) Ensure the RUSP keeps pace with treatment approvals, 2) Collaborate with and rely upon the FDA, and 3) Ensure the process to advance a disorder through the ACHDNC is transparent, predictable, and timely. Would you mind sharing the evidence that supported each of these recommendations? (Perhaps discuss the problem and why each recommendation would solve that problem.) 6. ASGCT has worked with partner organizations to support other NBS efforts, and you are hosting a workgroup and symposium in May to discuss advancements in NBS. Thank you for inviting Dr. Brower and NBSTRN to present during the workshop. Please tell us more about these important events and how our listeners can participate. 7. You and Dr. Brower serve on the planning committee for an effort by the EveryLife Foundation to develop actionable policy solutions aimed at ensuring newborn screening continues to advance. This includes the research facilitated by NBSTRN and ASGCT, as well as the policies that you and your team champion. 
Can you describe why these types of efforts to build coalitions and collaborations across different stakeholder groups are so important? 8. You have a very interesting career path and your work inspires many of us to achieve meaningful change and work to advance discoveries that save and improve lives. Can you share with our listeners what sparked your interest in the revolution that is gene and cell therapy as well as newborn screening research? 9. What does NBS research mean to you? Learn more about ASGCT annual meeting at https://annualmeeting.asgct.org To learn more about newborn screening research data tools and resources, visit www.nbstrn.org
Gift Shop, Bobcat Rehab and Feeding Volunteer Expansion Projects On May 15, 2016 I purchased a 48 x 72 foot, 1996 MH from Marty at AAA, which had previously been a TECO office, for $42,000.00. Its purpose is to replace the 1200 SF Gift Shop so we can use the old Gift Shop for tour staging, inside in the A/C. Today it's getting new A/C, and Victor Alonzo has been working for a week to pull out all of the rotted, moldy walls and ceilings so we can replace and repair. I estimate that we will have $100k in it when done, but to build it would have been three times that. On June 28, 2016 we began our Bobcat Rehab expansion. That's turned out to be a $345,000.00 project that will give us 8 cages when done. We are about halfway done with Cage #4 right now and will try these out for a while to be sure we like them. On Sept 28, 2016 I purchased a mobile home from Marty at AAA to put on the lot right outside our back gate where the little single-wide called The Goose House sits. In Feb. 2017 I finally got AAA to move the Goose House to the side of Food Prep for Gale to use as a new Keeper Cafe, but the county is still jerking us around on permitting the new MH on the old site of the Goose House. I bought a lot of lots and trailers on Meadowview last year: four at one time from one owner, and a few others. Those projects are all going on now too, but the ones above have been my priority. That and webcams… Today one of our viewers snapped this of me from the webcam I was trying to position out on Nabisco and Mrs Claws. Not enough signal though, so it's on Moses Bobcat for now. Hi, I'm Carole Baskin and I've been writing my story since I was able to write, but when the media goes to share it, they only choose the parts that fit their idea of what will generate views. If I'm going to share my story, it should be the whole story. The titles are the dates things happened. 
If you have any interest in who I really am please start at the beginning of this playlist: http://savethecats.org/ I know there will be people who take things out of context and try to use them to validate their own misconception, but you have access to the whole story. My hope is that others will recognize themselves in my words and have the strength to do what is right for themselves and our shared planet. You can help feed the cats at no cost to you using Amazon Smile! Visit BigCatRescue.org/Amazon-smile You can see photos, videos and more, updated daily at BigCatRescue.org Check out our main channel at YouTube.com/BigCatRescue Music (if any) from Epidemic Sound (http://www.epidemicsound.com) This video is for entertainment purposes only and is my opinion. Closing graphic with permission from https://youtu.be/F_AtgWMfwrk
Do you want to know how to live in intentional optimism? In this segment, Andrea Johnson discusses the six tenets of living in intentional optimism. 1. Optimistic 2. Present 3. Energetic 4. Courageous 5. Wise 6. Intentional All of these complement personal growth. See video here - https://youtu.be/KN02DZoa4IA WHO IS ANDREA? My personal stories of childhood obesity, bulimia and depression, early menopause, gastric bypass surgery and even adoption are all threads in the lovely tapestry of my journey leading me here. In Feb. 2017, my mother finished her 15-year fight with breast cancer, and the reflective nature of grief allowed me to purposefully walk through the pain and anxiety, and emerge, as if from a crucible, holding the gem of Intentional Optimism (The Six Tenets are: Optimistic, Present, Energetic, Courageous, Wise and Intentional) - my personal growth plan - a comprehensive approach to Personal Growth and Leadership Training viewed through the lens of hope. Never underestimate your impact on those around you. Our stories are powerful, and we have a responsibility to share. We all lead at every level, in any area, using our unique gifts. We are the answer, and the role models for future generations. I am certified by the John Maxwell Team, an international Leadership Coaching organization, and a certified DISC Behavioral Analysis Consultant. ANDREA'S CALL TO ACTION Free 30-minute consultation: https://andreajohnson.as.me/Consultation Ultimate DISC Cheat Sheet: https://www.theintentionaloptimist.com/cheat Here's the Vision Board Workshop: https://www.theintentionaloptimist.com/vision Links Page: https://www.theintentionaloptimist.com/IntentionalOptimistLinks GENESIS'S INFO https://thehello.llc/GENESISAMARISKEMP CALL TO ACTION Subscribe to the GEMS with Genesis Amaris Kemp Channel, hit the notifications bell so you don't miss any content, and share with family/friends. **REMEMBER - You do not have to let limitations or barriers keep you from achieving your success. 
Mind over Matter...It's time to shift and unleash your greatest potential. If you would like to be a SPONSOR or have any of your merchandise mentioned, please reach out via email at GEMSwithGenesisAmarisKemp@gmail.com --- Send in a voice message: https://anchor.fm/genesis-amaris-kemp/message Support this podcast: https://anchor.fm/genesis-amaris-kemp/support
How do you overcome the obstacles and naysayers to start your own business? How do you balance being a mom/parent and an entrepreneur? How do you Be The Difference to your people to grow a thriving business? How do you give back to your community? As a female, how do you survive and thrive in a male-dominated industry? Listen in as local MVP Trisha Turner shares these answers and so much more! Born and raised in Saint Johns, Michigan, Trisha is a 2009 graduate of MSU's Landscape Architecture bachelor's program (all from taking one plant class her freshman year). She had difficulty finding a job in 2009, so she started pursuing her accelerated bachelor's in nursing. While working on pre-reqs, she went to one last job fair in a snowstorm that her Dad made her go to, and received five job offers. From there she worked in Livonia for a landscape design-build firm, and that really shaped the start of her career. Trisha wanted to start a family and knew the commute from Howell would be too much with a kiddo, so she was hired at a landscape design-build firm in Brighton. Part of her hire was an option to buy in, and in the winter of 2020 she decided that if the buy-in was not going to move forward, she would start her own business. In Feb 2021 she started Turner Design Group. The company started with just her Dad, her, and a $70,000 client for that first season. Now her husband Jonathan is a lead foreman, and they are looking for great people for the 2022 season. They do not snow plow but rather do interior gardening, which sets them apart. Her education in landscape architecture, paired with extensive on-site experience, provides a unique product for their clients. She's a Dale Carnegie grad and lives her life by many principles!
Hola mi gente, I hope you are doing well this week! Ugh, it is such an honor to have my amazing amiga Ashley K. Stoyanov here today! Ashley K. Stoyanov Ojeda is an author, business development coach and socialpreneur. Originally from Queens, NYC and born to a Mexican mom and a French-American father, Ashley's career started in the music industry in 2012, working at major record labels, publishers, and venues. After relocating to Portland, OR post-college, she created her own network for local womxn songwriters, #WomxnCrush Music, now a national organization that has been featured by The Recording Academy. Since the rapid growth of her organization, she has dedicated her career to creating opportunities and developing businesses and communities of underrepresented entrepreneurs through her coaching and consulting, and has become known as the Business Hada Madrina (Business Fairygodmother). Ashley joined The Mujerista team in 2020 to help create and grow The Mujerista Network, a digital network dedicated to empowering and celebrating the next generation of Latinas making an impact en la cultura. In Feb 2022, she'll be releasing her debut book, Jefa in Training, through Mango Publishing. Ashley currently resides in Portland, Oregon. You can connect with her on Instagram at @ashleykstoyanovojeda and learn more about her on her website. IG: @cafecitoconestrellita --- Support this podcast: https://anchor.fm/estrella-serrato/support
Brett has experience in high-impact, trusted-advisor roles in growing small businesses across multiple industries. He has worked with hundreds of $1M-$10M business owners and their teams to develop and implement the organizational and leadership processes and systems they need to grow. As the original leader of Infusionsoft's "Built to Last" efforts, Brett spent 10 years helping Infusionsoft grow from $7M in revenue to over $100M. Brett's roles at Infusionsoft included: Built to Last Champion, Strategic Advisor to the CEO, VP of Infusionsoft, and VP of Leadership Development. Brett also co-created the Elite Programs with Infusionsoft's CEO, Clate Mask. In Feb 2018, Brett bought the Elite business from Infusionsoft and named it Elite Entrepreneurs. When he isn't busy serving Elite businesses, Brett loves family life with his beautiful wife, Sharon, and their 8 children! Find Brett on LinkedIn: https://www.linkedin.com/in/built2lastchamp/ Website: https://growwithelite.com/ 2:33 "One key to making it all happen is getting clarity with both your work partners... and life partner." - Brett Gilliland 3:39 "That family, keeping things together at home time and investing in my key relationships was really critical." - Brett Gilliland 10:22 "She inspires me and I hope that I inspire her... we're definitely a team." - Brett Gilliland 11:43 "She employs all of that genius here in our home... so, in our relationship, I find ways for her to leave the house as often as I can." - Brett Gilliland 13:15 "She has personal growth things that she wants to do... so I arrange my life as a business owner to enable her." - Brett Gilliland 14:45 "All along the way, we're having to make adjustments... it's a constant collaboration." - Brett Gilliland 17:44 "If we're not creating something together, it's more likely that we start to move in different directions." - Brett Gilliland 18:55 "I want an enduring family that matters." 
- Brett Gilliland 22:15 "Ask your spouse, what do I do that annoys you... that life would be a little bit sweeter if I didn't do this thing?" - Brian Keith 24:16 "Building that trust with yourself is what enables you to weather those storms." - Brian Keith
This week's topic is a very sensitive one (trigger warning). The author of "No Tougher Duty, No Greater Honor", Christian Bussler, grew up as an Air Force brat in the 1970s and 1980s. He joined the Marine Corps Reserves right out of high school and was assigned to MP Co "C" in Dayton, Ohio as an 0311 Rifleman. He was cross-trained as a 9051 Graves Registration/Mortuary Affairs Marine in the early 1990s. In 2003, his platoon of Mortuary Affairs specialists was activated to head to Kuwait, and he participated in the invasion of Iraq that year. In Feb of 2004, he volunteered to go back to Iraq and was assigned to Weapons Co, 3rd Battalion, 4th Marines, where he was wounded in combat operations while fighting against Al Qaeda terrorist insurgents. After his surgeries, he volunteered once again to go back to Iraq in 2005, but this time as the Staff NCOIC of all Mortuary Affairs operations in and around Al Taqaddum, Iraq. His book "No Tougher Duty, No Greater Honor" details his experiences as a forward operating "body-bagger". He tells the stories that no one ever speaks of, about a duty that few know exists. He spent four years teaching himself how to write, and his book is the result of thousands of hours of writing and re-writing his difficult experiences in war. He hopes that his work can provide the answers that families of the fallen may seek, and he hopes to honor the sacrifices of those who have paid the ultimate cost for freedom. Show Links www.usacares.org FB - @usacaresorg Twitter - @USACares IG - @usacares YouTube - USA Cares YouTube Guest Links https://www.linkedin.com/in/christian-bussler-376598154 https://www.amazon.com/No-Tougher-Duty-Greater-Honor/dp/1546604936 Sponsors Speakeasy Podcast Network - www.speakeasynetwork.com
Dr Randall sits down with the Minstrel of Malibu, Jacqui Hylton! Jacqui Hylton, singer, songwriter, and composer, continues to captivate her audiences locally and through her recordings. Having earned the moniker The Minstrel of Malibu, her work as a jazz singer includes gigs at the most reputable venues: Catalina's, The Baked Potato, Beverly Hills Hotel, The Mint, Genghis Cohen, and the Ford Amphitheatre, to name a few. As a songwriter, her sound is an acoustic blend of pop, jazz, classical and blues. Ms. Hylton recently recorded one of her original songs, entitled "Beautiful", alongside Oscar-nominated composer, arranger and conductor David Campbell and his 22-piece orchestra ensemble. Notable musicians on this recording also include bass player Nathan East (Toto, Eric Clapton, Barbra Streisand, Celine Dion), Vinnie Colaiuta (Sting, Frank Zappa) and pianist Tamir Hendleman (Barbra Streisand, Natalie Cole, Michael Buble). Ms. Hylton lures audiences with lush vocals and composes vibrant melodies, generously supporting songs that touch the heart of human experiences. She was classically trained on piano at a young age through the Royal Conservatory of Music, which grew into a love for jazz after she was introduced to her father's jazz collection featuring Billie Holiday, Sarah Vaughan and Ella Fitzgerald. These legendary musical influences spurred a love for writing, singing, stage performance and frequenting jazz clubs. After relocating from Toronto, Canada to Los Angeles, she became a student of voice builder to the stars Gary Catona and vocal coach Annette Warren Smith. Ms. Hylton currently resides in Agoura Hills, CA. Ms. Hylton has created a following in the Malibu and Beverly Hills communities, performing at Savory Malibu, Sage Room, the Malibu Music Festival, Malibu Golf to End Cancer Benefit Concerts, the Sofitel Hotel with Ryan Cross, DOMA Beverly Hills, the Beverly Hills Rotary Club and Nic's Beverly Hills. Ms. Hylton has shared the stage and performed with accomplished musicians bassist Reggie Hamilton, saxophonist Katisse Buckingham, and drummer Joey Heredia, who are members of multi-Grammy-award-winning Billy Childs' band. From 2016 to 2019, Ms. Hylton performed in the greater Los Angeles area at distinguished jazz clubs and abroad, and she is a volunteer member of Maria Newman's Westwood Choir and Orchestra. Vocalist Annette Warren Smith, 95, shadow singer for Lucille Ball in Fancy Pants and Ava Gardner in Show Boat, presented Ms. Hylton as a special guest performer at Catalina's jazz club in Hollywood. Ms. Hylton also joined the stage with multi-Grammy-winning double bassist/orchestral arranger James Leary (Frank Sinatra, Sammy Davis Jr, Count Basie Orchestra) at The World Stage Theatre. On Dec 15th, 2018, Jacqui Hylton & Friends performed a jazz concert and Holiday Toy Drive to benefit The Children's Lifesaving Foundation at Aldabella's in Westlake Village. Musicians included Katisse Buckingham on sax and woodwinds (associated acts: Herbie Hancock, Billy Childs, the late Prince), Chris Cadenhead on keys (associated acts: the late Prince, Fourplay, Justin Bieber), Tony Moore on drums (Norman Brown, Jeff Lorber), and John Hart on bass (Gladys Knight). The evening featured original compositions by Jacqui Hylton as well as jazz covers. From Jan 2019 to March 2020, Jacqui Hylton held a residency at The Hilton Checkers Hotel in Los Angeles. In Feb 2021, Ms. Hylton performed a virtual concert for MJCS' annual Purimspiel in Malibu, CA. Beginning Sept 2021, Ms. Hylton created Jazz Sundays, performing weekly on Sunday nights at Nonna Restaurant in Westlake Village, CA. Tune in!
In Feb 1981, Joey Coyle, a Philly longshoreman, found $1.2 million in cash in the middle of the road and decided to keep it. Too bad the money belonged to the Federal Reserve and they wanted it back.
Barby Ingle is a best-selling author, reality personality, and lives with multiple rare and chronic diseases: reflex sympathetic dystrophy (RSD), migralepsy, PALB2-var breast cancer, valley fever, endometriosis and other pain disorders. Barby is a chronic pain educator, patient advocate, and president of the International Pain Foundation. She is also a motivational speaker and best-selling author on pain topics. Her blog, reality shows and media appearances are used as a platform to help her become an e-Patient advocate, and she presents at healthcare conferences, speaking publicly, sharing her story, educating and advocating for patients across the globe. She has received more than 20 accolades over the years for her advocacy work, including: 2012 WEGO Health Ms. Congeniality, 2012 NAF You Are Our Hero Award, 2013 International Inspirational Luminary, 2015 IDA Impact Award, and 2016 WEGO Health Lifetime Achievement. In 2017, Barby was named a Health Information Technology Top 100 Influencer by HealthScene and a Top 20 Health Influencer by Insider Monkey Magazine. In 2018, Barby received the Reality All Star Reunion Superstar award for her social media efforts and was named to the Top 50 Chronic Pain Advocates. In Feb. 2021, Barby was listed in the top 75 social media advocates for Rare Diseases. In 2020, Barby was listed in the top 50 social media advocates for Rare Diseases and the top 10 Healthcare Influencers for All Marketers to Follow, the 2020 PharmaVOICE100, and as the 2020 HITMC Patient Advocate of the Year. In 2021, Barby was awarded the 2021 Medigy HITMC Music Video of the Year and the 2021 Arizona Capitol Times Leader of the Year: Healthcare. www.internationpain.org You can also check out her new movie at www.backhomeagainmovie.com
THE EMBC NETWORK featuring: ihealthradio and worldwide podcasts
Featured in the media more than 1250 times with over 20 accolades to her name, Barby is best known as a Cheerleader of HOPE, author, and reality television personality. Ken, her husband, and Barby provide motivation, inspiration, energy and penny-saving tips for better daily living. As an Extreme Time Saver, Barby lives life to the fullest despite living in chronic pain. Ingle is a best-selling author, reality personality, and lives with multiple rare and chronic diseases: reflex sympathetic dystrophy (RSD), migralepsy, PALB2-var breast cancer, valley fever, endometriosis and other pain disorders. Barby is a chronic pain educator, patient advocate, and president of the International Pain Foundation. She is also a motivational speaker and best-selling author on pain topics. Her blog, reality shows and media appearances are used as a platform to help her become an e-Patient advocate, and she presents at healthcare conferences, speaking publicly, sharing her story, educating and advocating for patients across the globe. She has received more than 20 accolades over the years for her advocacy work, including: 2012 WEGO Health Ms. Congeniality, 2012 NAF You Are Our Hero Award, 2013 International Inspirational Luminary, 2015 IDA Impact Award, and 2016 WEGO Health Lifetime Achievement. In 2017, Barby was named a Health Information Technology Top 100 Influencer by HealthScene and a Top 20 Health Influencer by Insider Monkey Magazine. In 2018, Barby received the Reality All Star Reunion Superstar award for her social media efforts and was named to the Top 50 Chronic Pain Advocates. In Feb. 2021, Barby was listed in the top 75 social media advocates for Rare Diseases. In 2020, Barby was listed in the top 50 social media advocates for Rare Diseases and the top 10 Healthcare Influencers for All Marketers to Follow, the 2020 PharmaVOICE100, and as the 2020 HITMC Patient Advocate of the Year. 
In 2021 Barby was awarded the 2021 Medigy HITmc Music Video of the year and 2021 Arizona Capitol Times Leader of the Year; Healthcare. Her books and additional info can be found on her website. http://barbyingle.com/
Katie left home when she was 18 and moved to Atlanta to go to culinary school. She lived and worked there for 6 years, and then moved to NYC in November of 2015. She worked in Manhattan fine dining restaurants and started at the bottom. In Feb 2020 she took her first Executive position, and about 4 weeks later they were shut down due to COVID. Katie got her real estate license in October 2020, but just after that she found out her mom had cancer, so she went to Tennessee to take care of her. She came back to NYC in mid-February, and that is really when she started working in NY real estate.
Barby Ingle was living her dream and, in her words, was taking life for granted. She trained and performed cheerleading, dance, and gymnastics starting at age 4 through college. Straight out of college she started her own cheer/dance training company. A year later she was hired by Washington State University as the head spirit program coach. She's been battling chronic pain since 1997, first with endometriosis, which resulted in a full hysterectomy and left oophorectomy. Then in 2002, she developed Reflex Sympathetic Dystrophy (RSD), a progressive neuro-autoimmune condition that affects multiple systems in the body and needs to be treated early so that disability does not take over, along with TMJ. She lost her physical abilities and was bed-bound for years, using a wheelchair to get out of bed. It took 3 years to get a proper diagnosis and another 4 years to get the proper treatment. Barby knows firsthand how hard it is to continue looking for relief and perfect answers, only to come up against healthcare professionals who blow you off or do not believe that what you are saying could actually be what you're experiencing. Barby said, "As I search for a cure, I became my own best advocate." Barby works to share this information so that others do not have the same life struggles that she has had. Even after seeing over 100 healthcare professionals, having major surgeries that she didn't need, and complications such as internal bleeding, medication interactions, kidney stones, tumors, severe constipation and so much more - she did not give up or give in! Barby was tested on her limits and realized they were past the boundaries she had placed on herself. She had to become the Chief of Staff of her own medical team. Barby says, "If I can do it, anyone can." We all just need support and HOPE! Barby is a best-selling author, reality personality, and lives with multiple rare and chronic diseases: reflex sympathetic dystrophy (RSD), migralepsy, PALB2-var, endometriosis, and other pain disorders. 
Barby is a chronic pain educator, patient advocate, and president of the International Pain Foundation. She is also a motivational speaker and best-selling author on pain topics. Her blog, reality shows, and media appearances are used as a platform to help her become an e-Patient advocate, and she presents at healthcare conferences, speaking publicly, sharing her story, educating and advocating for patients across the globe. She has received more than 20 accolades over the years for her advocacy work, including: 2012 WEGO Health Ms. Congeniality, 2012 NAF You Are Our Hero Award, 2013 International Inspirational Luminary, 2015 IDA Impact Award, and 2016 WEGO Health Lifetime Achievement. In 2017, Barby was named a Health Information Technology Top 100 Influencer by HealthScene and a Top 20 Health Influencer by Insider Monkey Magazine. In 2018, Barby received the Reality All-Star Reunion Superstar award for her social media efforts and was named to the Top 50 Chronic Pain Advocates. In Feb. 2021, Barby was listed in the top 75 social media advocates for Rare Diseases. In 2020, Barby was listed in the top 50 social media advocates for Rare Diseases and the top 10 Healthcare Influencers for All Marketers to Follow, the 2020 PharmaVOICE100, and as the 2020 HITMC Patient Advocate of the Year. Barby Ingle, The Chosen
The Bacon Podcast with Brian Basilico | CURE Your Sales & Marketing with Ideas That Make It SIZZLE!
Kymberli is an author, speaker, executive coach, and professional trainer. Currently, she serves as the president-elect of NSA-Austin. In addition to her love for speaking, she teaches executive transition courses for the USAF. Kymberli is an Air Force Academy graduate. She has jumped out of planes, soloed in a plane, and purchased 36 F-15E fighter jets when she was in the Air Force. In Feb. of 2020, Kymberli released her book, I Need To Know You: How to Meet Ordinary, Extraordinary People and Improve Your Life, which is about a challenge she took to meet 100 people in 100 days. Kymberli is still in touch with 67 of those individuals 2 years later and now speaks on relationship building, which is key in our professional and personal lives. Kymberli has been blessed with the opportunity to have lived in many wonderful places: as far north and east as Maine, as far south as Texas, and as far west as Hawaii and Okinawa, Japan. She has had the privilege of meeting wonderful people everywhere and understands the importance and power of building solid, mutually beneficial relationships. What if the next person crossing your path today is the exact person you need to know tomorrow? The truth is, we need each other. We were not created to live life alone. So why does it often seem so difficult to connect? In social and professional settings, Kymberli has always felt the need to make friends. Frequently she's contemplated the question, "How do I form new yet meaningful relationships?" Kymberli has discovered the key to accomplishing this challenge. While completing a challenge to meet 100 people in 100 days, Kymberli learned how to be intentional about meeting and connecting with people, and her life has never been the same. I Need To Know You gives practical and applicable tips and "how-to" examples that will help you improve your personal and professional connections. It's time to get started cultivating the relationships that will enrich your future. 
As you read this book, you will learn the fundamentals of growing your network at any time, keys to networking and relationship building, techniques for having exceptional and memorable conversations, approaches for uncovering opportunities to give and receive help, and ideas for meeting people. Building a vibrant network takes time. Learn how to meet ordinary, extraordinary people, and improve your life—starting today! Learn More About Kymberli - Click Here