Welcome to our Romance Royale series. On this episode, we'll be talking about the King's many grannies: the royal couplings that had to happen to further the family tree. We're looking at five such couples today: James I of Scotland and Joan Beaufort, Owain Tudor and Catherine of Valois, Edmund Tudor and Margaret Beaufort, Henry VII and Elizabeth of York, and, of course, James VI/I and Anna of Denmark. Let's welcome our experts Callum Watson, Nathen Amin and Mark Turnbull.
Callum's Books:
https://www.pen-and-sword.co.uk/1314-The-Year-of-Bannockburn-Hardback/p/49813/aid/1238
https://link.springer.com/book/10.1007/978-3-030-37767-0
Find Callum:
https://www.youtube.com/channel/UCiesDZuBN1Z0SE3Vq3Gjz_A
https://drcallumwatson.blogspot.com/
https://www.instagram.com/cpwatson1375/
Find Nathen:
https://substack.com/@nathenamin
https://www.instagram.com/nathenamin/
Get Son of Prophecy:
https://www.amberley-books.com/author-community-main-page/a/nathen-amin/the-son-of-prophecy.html
Get Nathen's Books:
https://www.amberley-books.com/author-community-main-page/a/nathen-amin.html
Get Mark's Book:
https://www.pen-and-sword.co.uk/Charles-Is-Private-Life-Hardback/p/23661/aid/1238
Find Mark:
https://www.allegianceofblood.com/
https://www.instagram.com/1642author/
www.1642author.com
https://podcasts.apple.com/gb/podcast/cavaliercast-the-civil-war-in-words/id1521758820
For more history fodder please visit https://www.ifitaintbaroquepodcast.art/ and https://www.reignoflondon.com/
To book a walking tour with Natalie: https://www.getyourguide.com/s/?q=supplier:252243
Get bonus content on Patreon.
Hosted on Acast. See acast.com/privacy for more information.
This is one of our Bonus Episodes for the Katharine of Aragon Festival, which happens this weekend in Peterborough, as it does every year. In this mini episode, let's chat to Nathen Amin as he teases us on the topic of his upcoming talk at the festival. Welcome back, Nathen!
Get Son of Prophecy:
https://www.amberley-books.com/author-community-main-page/a/nathen-amin/the-son-of-prophecy.html
Get Nathen's Books:
https://www.amberley-books.com/author-community-main-page/a/nathen-amin.html
Join the Festival:
Cathedral: https://www.ticketsource.co.uk/whats-on/peterborough/peterborough-cathedral
https://peterborough-cathedral.org.uk/about/history/katharine-of-aragon/kofa_25/
Online Tudor Talks:
https://peterborough-cathedral.org.uk/event/online-tudor-talks/
Pre-Book Katharine of Aragon: Spanish Princess by Heather R Darsie:
https://www.amberley-books.com/author-community-main-page/d/community-heather-r-darsie/katherine-of-aragon-spanish-princess.html
Join Natalie on her London Walking Tours:
Royal London - Anglo-Saxons to Tudors: https://www.getyourguide.com/london-l57/london-the-royal-british-kings-and-queens-walking-tour-t426011/
Royal London - Stuarts to Windsors: https://www.getyourguide.com/london-l57/royal-london-georgian-and-windsor-monarchs-walking-tour-t481355/
Naughty London: https://www.getyourguide.com/london-l57/london-unsavory-history-guided-walking-tour-t428452/
For more history, check out https://www.ifitaintbaroquepodcast.art/ and https://www.reignoflondon.com/
Get bonus content on Patreon.
Hosted on Acast. See acast.com/privacy for more information.
In this episode, Jackson sits down to talk to author, historian and Katharine of Aragon Festival speaker Nathen Amin to discuss the Welsh roots of the Tudors, and to reinsert the Welshness of Henry VII back into the historical narrative of the Tudors, which Nathen brings to us in his brand new book 'Son of Prophecy: The Rise of Henry Tudor'. This is a rerun from July 2024.
The Katharine of Aragon Festival runs from the 22nd to the 29th of January and is hosted by Peterborough Cathedral and Peterborough Museum. Tickets are still available; follow the link to learn more about the Katharine of Aragon Festival.
Grab a copy of Son of Prophecy here.
Keep up to date with Nathen via his X, Instagram, website and Newsletter.
If you want to get in touch with History with Jackson, email: jackson@historywithjackson.co.uk
Please support us on our Patreon!
To catch up on everything to do with History with Jackson head to www.HistorywithJackson.co.uk
Follow us on Facebook at @HistorywithJackson
Follow us on Instagram at @HistorywithJackson
Follow us on X/Twitter at @HistorywJackson
Follow us on TikTok at @HistorywithJackson
Get bonus content on Patreon.
Hosted on Acast. See acast.com/privacy for more information.
Ken talks with Mark Hart, author of “One Sunday at a Time: Preparing Your Heart for Weekly Mass – Cycle C” (Ave Maria Press), and Father Nathen Cromly, author of “Coached by Paul the Apostle: Lesson in Transformation” (Scepter Publishers). Mark's book is available at: https://www.avemariapress.com/products/one-sunday-at-a-time-cycle-c and Father Cromly's book at: https://scepterpublishers.org/products/coached-by-paul-the-apostle-lessions-in-transformation?_pos=2&_sid=ab810584e&_ss=r&variant=44218634764465 Follow Mark at: www.biblegeek.com and https://www.facebook.com/MarkHart99/ Follow Father Cromly at: https://www.saintjohninstitute.org/ The article “Meet the Author with Ken Huck – January 16, 2025 – Mark Hart 'One Sunday at a Time: Preparing Your Heart for Weekly Mass – Cycle C' and Father Nathen Cromly 'Coached by Paul the Apostle: Lesson in Transformation'” comes from Radio Maria.
Pro wrestling and sports entertainment have walked a fine line with the adult entertainment industry since the Monday Night Wars between WWF and WCW. Not only did the two biggest companies in pro wrestling use adult entertainment and entertainers, but ECW during that time did as well. TNA even used Chyna for a match after she had started her adult film career, and recently Matt Riddle was on MLW talking about being at the AVN Awards. So why, in 2024, is being an adult entertainer and an indy wrestler an issue?
Tonight's guest is indy wrestler and adult entertainer Nadia White. They describe the difficulty they've had recently with promoters booking them, only to cancel AFTER the promoter looks them up, even though many wrestlers nowadays have OnlyFans accounts. Nadia shares their frustrations and even a story about how they almost didn't get to have their first match!
Nadia also talks about the sexism within pro wrestling: we've seen male wrestlers portray adult film stars, become adult film directors, and still get booked, yet Nadia finds it's more difficult for them. During the interview, we also learn who Nadia enjoyed watching when they were growing up and who they would watch wrestling with. Nadia also discusses training, keeping their adult entertainment name the same as their wrestling name, and what's in store for Nadia White in 2025!
Nadia White's Social Media
Instagram: @NadiaWWrestling
Twitter: @NadiaWWrestling
Pro Wrestling Tees: https://www.prowrestlingtees.com/wrestler-t-shirts/nadiawhite.html
Time Stamps
00:00 Intro
00:16 We've seen professional wrestlers come from all walks of life. Finn Balor once had a "Balor Club is for everyone" shirt, and AEW had a motto that wrestling is for everyone. Today I am sitting down with Nadia White, an independent pro wrestler who is also an adult film star and has faced many challenges in their journey as a professional wrestler
00:32 Nadia, how are you today?
00:58 Were you a fan growing up?
01:17 Who were some favorites?
02:17 How was it being trained by Davey Richards and Team Ambition?
03:43 Reading Nadia's tweets
04:43 Pro wrestling and the adult film industry go hand in hand
06:49 How often does a promoter cancel, and what have they said to you?
07:52 Did your original opponent for your first match still work that show after refusing to work with you?
08:16 Calling out a female wrestler who refused to wrestle them
08:51 The excitement of the first match; didn't know her, but messaged her about the match beforehand
09:35 Advice from pro wrestlers like Jerry Lynn
11:11 Keeping their adult film name the same as their wrestling name (they were advised to change it)
12:11 Being banned from Nathan's Hot Dog Eating Contest because they are an adult film star
13:50 Any issues in the locker room or at shows? "Kids love me at shows."
15:00 Promotions and wrestlers who've had issues with them are in the "Bible Belt of the USA"; promoters not researching them; it happened last week and has been going on for two years, not getting booked for being an adult film star
17:45 Has their husband faced any backlash? Wrestling being sexist
18:20 But it's okay for Matt Riddle on MLW to joke about going to the AVN Awards
19:00 Talk of Triple H not wanting Chyna in the WWE HOF for doing porn; then X-Pac shouldn't be in either
21:00 How promoters act when they cancel on them
26:10 What's in store for pro wrestling and Nadia in 2025
26:55 "I just want to wrestle"
27:11 Is there anyone they want to work with, or a company they want to work in? "AEW. TNA." "Go to Japan"
29:00 What they want the fans or promoters to know about them
#WrestlingCommunity #indywrestler #SupportIndyWrestling #TNA #AEW #NadiaWhite #WrestlingNews #DarkSideOfTheRing
Support this podcast at: https://redcircle.com/perchedonthetoprope/donations
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
Why is Henry VII remembered as an intensely suspicious king, wracked by paranoia? According to Nathen Amin, the answer lies in his death-defying rise to power. In this Long Read, written by Nathen, we delve into the turbulent youth of the first Tudor monarch. HistoryExtra Long Reads brings you the best articles from BBC History Magazine, direct to your ears. Today's feature originally appeared in the September 2024 issue, and has been voiced in partnership with the RNIB. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Henry VII has gone down in history as the miserable miser who, rightly or wrongly, seized the English Crown from the hands of Richard III at the battle of Bosworth. But, according to historian and author Nathen Amin, Henry's rise to power was unprecedented – and his rotten reputation blown out of proportion. In this 'Life of the week' episode, Nathen speaks to Emily Briffett about the life and legacy of the first Tudor monarch – from his major political successes to his close family bonds. (Ad) Nathen Amin is the author of Son of Prophecy: The Rise of Henry Tudor (Amberley, 2024). Buy it now from Amazon: https://www.amazon.co.uk/Son-Prophecy-Rise-Henry-Tudor/dp/1398110477/?tag=bbchistory045-21&ascsubtag=historyextra-social-histboty. The HistoryExtra podcast is produced by the team behind BBC History Magazine. Learn more about your ad choices. Visit podcastchoices.com/adchoices
In this episode, Jackson sits down to talk to author and historian Nathen Amin to discuss the Welsh roots of the Tudors, and to reinsert the Welshness of Henry VII back into the historical narrative of the Tudors, which Nathen brings to us in his brand new book 'Son of Prophecy: The Rise of Henry Tudor'.
Grab a copy of Son of Prophecy here.
Keep up to date with Nathen via his X, Instagram, website and Newsletter.
To find out more about Gloucester History Festival head to: https://www.gloucesterhistoryfestival.co.uk/
Or head to @GlosHistFest on Twitter or Instagram for more details.
If you want to get in touch with History with Jackson, email: jackson@historywithjackson.co.uk
To catch up on everything to do with History with Jackson head to www.HistorywithJackson.co.uk
Follow us on Facebook at @HistorywithJackson
Follow us on Instagram at @HistorywithJackson
Follow us on X/Twitter at @HistorywJackson
Follow us on TikTok at @HistorywithJackson
Get bonus content on Patreon.
Hosted on Acast. See acast.com/privacy for more information.
We're already well into 2024 and it's sad that people still have enough fuel to complain about various aspects of their engineering life. DORA seems to be turning into one of those problem areas. Not at every organization, but some places are turning it into a case of "hitting metrics" without caring for the underlying capabilities and conversations.
Nathen Harvey is no stranger to this problem. He used to talk a lot about SRE at Google as a developer advocate. Then, he became the lead advocate for DORA when Google acquired it in 2018. His focus has been on questions like: how do we help teams get better at delivering and operating software?
You and I can agree that this is an important question to ask. I'd listen to what he has to say about DORA because he's got a wealth of experience behind him, having also run community engineering at Chef Software.
Before we continue, let's explore "What is DORA?" in Nathen's (paraphrased) words:
DORA is a software research program that's been running since 2015. This research program looks to figure out: how do teams get good at delivering, operating, building, and running software?
The researchers were able to draw out the concept of the metrics based on correlating teams that have good technology practices with highly robust software delivery outcomes. They found that this positively impacted organizational outcomes like profitability, revenue, and customer satisfaction. Essentially, all those things that matter to the business.
One of the challenges the researchers found over the last decade was working out: how do you measure something like software delivery? It's not the same as a factory system where you can go and count the widgets that we're delivering.
The unfortunate problem is that the factory mindset, I think, still leaks in. I've personally noted some silly metrics over the years, like lines of code. Imagine being asked constantly: "How many lines of code did you write this week?" You might not have to imagine. It might be a reality for you.
DORA's researchers agreed that the factory mode of metrics cannot determine whether or not you are a productive engineer. They settled on and validated 4 key measures for software delivery performance.
Nathen elaborated that 2 of these measures look at throughput:
[Those] two [that] look at throughput really ask two questions:
* How long does it take for a change of any kind, whether it's a code change, configuration change, whatever, to go from the developer's workstation right through to production?
And then the second question on throughput is:
* How frequently are you updating production?
In plain English, these 2 metrics are:
* Deployment Frequency: How often is code deployed to production? This metric reflects the team's ability to deliver new features or updates quickly.
* Lead Time for Changes: Measures the time it takes from code being committed to being deployed to production.
Nathen recounted his experience of working at organizations that differed in how often they update production, from once every six months to multiple times a day. They're very different types of organizations, so their perspectives on throughput metrics will be wildly different. This has some implications for the speed of software delivery.
Of course, everyone wants to move faster, but there's this other thing that comes in, and that's stability. And so, the other two stability-oriented metrics look at: what happens when you do update production and... something's gone horribly wrong.
“Yeah, we need to roll that back quickly or push a hot fix.”
In plain English, they are:
* Change Failure Rate: Measures the percentage of deployments that cause a failure in production (e.g., outages, bugs).
* Failed Deployment Recovery Time: Measures how long it takes to recover from a failure in production.
You might be thinking the same thing as me. These stability metrics might be a lot more interesting to reliability folks than the first 2 throughput metrics. But keep in mind, it's about balancing all 4 metrics.
Nathen believes it's fair to say that today, across many organizations, people look at these concepts of throughput and stability as trade-offs of one another. We can either be fast or we can be stable. But the interesting thing that the DORA researchers have learned from their decade of collecting data is that throughput and stability aren't trade-offs of one another. They tend to move together.
They've seen organizations of every shape and size, in every industry, doing well across all four of those metrics. They are the best performers. The interesting thing is that the size of your organization doesn't matter, nor does the industry you're in. Whether you're working in a highly regulated or unregulated industry, it doesn't matter.
The key insight that Nathen thinks we should be searching for is: how do you get there? To him, it's about shipping smaller changes. When you ship small changes, they're easier to move through your pipeline. They're easier to reason about. And when something goes wrong, they're easier to recover from and restore service.
But along with those small changes, we need to think about feedback cycles. Every line of code that we write is, in reality, a little bit of an experiment. We think it's going to do what we expect and help our users in some way, but we need to get feedback on that as quickly as possible.
Underlying all of this, both small changes and fast feedback, is a real climate for learning. Nathen drew up a few thinking points from this: What is the learning culture like within our organization? Is there a climate for learning? And are we using things like failures as opportunities to learn, so that we can be ever improving?
I don't know if you're thinking the same as me already, but we're already learning that DORA is a lot more than just metrics. To Nathen (and me), the metrics should be one of the least interesting parts of DORA because it digs into useful capabilities, like small changes and fast feedback. That's what truly helps determine how well you're going to do against those performance metrics. Not saying, “We are a low to medium performer. Now go and improve the metrics!”
I think the issue is that a lot of organizations emphasize the metrics because they're something that can sit on an executive dashboard. But the true reason we have metrics is to help drive conversations. Through those conversations, we drive improvement. That's important because, currently, an unfortunately noticeable number of organizations are doing this, according to Nathen: I've seen organizations [where it's like]: “Oh, we're going to do DORA. Here's my dashboard. Okay, we're done. We've done DORA. I can look at these metrics on a dashboard.” That doesn't change anything.
We have to go the step further and put those metrics into action. We should be treating the metrics as a kind of compass on a map.
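To make those four measures concrete, here is a minimal sketch of how a team might compute them from a simple list of change records. This is my own illustration, not DORA's tooling; the field names and data shape are assumptions, and in practice the records would come from your CI/CD and incident systems.

```python
from datetime import datetime
from statistics import median

# Hypothetical change records: each dict is one change that reached production.
# The field names are assumptions for this sketch, not any real tool's schema.
changes = [
    {"committed_at": datetime(2024, 3, 1, 9, 0), "deployed_at": datetime(2024, 3, 1, 15, 0),
     "caused_failure": False, "restored_at": None},
    {"committed_at": datetime(2024, 3, 2, 10, 0), "deployed_at": datetime(2024, 3, 3, 11, 0),
     "caused_failure": True, "restored_at": datetime(2024, 3, 3, 13, 30)},
    {"committed_at": datetime(2024, 3, 4, 8, 0), "deployed_at": datetime(2024, 3, 4, 9, 30),
     "caused_failure": False, "restored_at": None},
]
window_days = 7  # the period these records cover

# Throughput: how often production is updated, and how long changes take to get there.
deployment_frequency = len(changes) / window_days  # deployments per day
lead_time_hours = median(
    (c["deployed_at"] - c["committed_at"]).total_seconds() / 3600 for c in changes
)

# Stability: how often a deployment causes a failure, and how long recovery takes.
failures = [c for c in changes if c["caused_failure"]]
change_failure_rate = len(failures) / len(changes)
recovery_hours = [
    (c["restored_at"] - c["deployed_at"]).total_seconds() / 3600 for c in failures
]
recovery_time_hours = median(recovery_hours) if recovery_hours else 0.0

print(f"Deployment frequency: {deployment_frequency:.2f} per day")
print(f"Median lead time for changes: {lead_time_hours:.1f} h")
print(f"Change failure rate: {change_failure_rate:.0%}")
print(f"Median failed deployment recovery time: {recovery_time_hours:.1f} h")
```

Nothing about the arithmetic is clever; the point is that the first two numbers describe throughput and the last two describe stability, and the research says to read them together rather than trading one pair off against the other.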
You can use those metrics to help orient yourself and understand, "Where are we heading?" But then you have to choose how you are going to make progress toward whatever your goal is.
The capabilities enabled by the DORA framework should help answer questions like:
* Where are our bottlenecks?
* Where are our constraints?
* Do we need to do some improvement work as a team?
We also talked about the SPACE framework, which is a follow-on tool from the DORA metrics. It is a framework for understanding developer productivity. It encourages teams or organizations to look at five dimensions when trying to measure something from a productivity perspective. It stands for:
* S: satisfaction and well-being
* P: performance
* A: activity
* C: communication and collaboration
* E: efficiency and flow
What the SPACE framework recommends is that you first pick metrics from two to three of those five categories. (You don't need a metric from every one of the five, but find something that works well for your team.) Then write down those metrics and start measuring them.
Here's the interesting thing: DORA is an implementation of SPACE. You can correlate each metric with the SPACE acronym:
* Lead time for changes is a measure of Efficiency and flow
* Deployment frequency is an Activity
* Change fail rate is about Performance
* Failed deployment recovery time is about Efficiency and flow
Keep in mind that SPACE itself has no metrics. It is a framework for identifying metrics. Nathen reiterated that you can't use "the SPACE metrics" because there is no such thing.
I mentioned earlier how DORA is a means of identifying the capabilities that can improve the metrics. These can be technical practices like using continuous integration. But they can also be capabilities like collaboration and communication. As an example, you might look at what your change approval process looks like. You might look at how collaboration and communication have failed when you've had to send changes off to an external approval board like a CAB (change approval board).
DORA's research backs this up: What our research has shown, through collecting data over the years, is that while they do exist, on the whole an external change approval body will slow you down. That's no surprise. So your change lead time is going to increase and your deployment frequency will decrease. But, at best, they have zero impact on your change fail rate. In most cases, they have a negative impact on your change fail rate. So you're failing more often.
It goes back to the idea of smaller changes, faster feedback, and being able to validate that, building in audit controls and so forth. This is something that reliability-focused engineers should be able to help with, because one of the things Sebastian and I talk about a lot is embracing and managing risk effectively and not trying to mitigate it through stifling measures like CABs.
In short, DORA and software reliability are not mutually exclusive concepts. They're certainly in the same universe. Nathen went as far as to say that some SRE practices necessarily get a little bit deeper than the capability level that DORA covers and provide even more specific guidance on how to do things.
He clarified a doubt I had, because a lot of people have argued with me (mainly at conferences) that DORA is this thing that developers do, earlier in the SDLC, and then SRE is completely different because it focuses on the production side.
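Looping back to the SPACE mapping above, here is a small sketch of the selection rule in practice. It is my own illustration: the dimension tags simply restate the list above, and the helper function is a hypothetical check rather than anything published by the SPACE authors.

```python
# Tag each DORA metric with the SPACE dimension it exercises (restating the mapping above).
METRIC_TO_DIMENSION = {
    "lead_time_for_changes": "E",            # efficiency and flow
    "deployment_frequency": "A",             # activity
    "change_failure_rate": "P",              # performance
    "failed_deployment_recovery_time": "E",  # efficiency and flow
}

def review_selection(chosen: list[str]) -> str:
    """Apply SPACE's guidance: pick metrics that span at least two of the five dimensions."""
    covered = {METRIC_TO_DIMENSION[m] for m in chosen}
    if len(covered) < 2:
        return "All chosen metrics sit in one dimension; add one from another category."
    return f"Covers {len(covered)} of the 5 SPACE dimensions: {', '.join(sorted(covered))}."

# The four DORA metrics together cover A, E, and P, so they already satisfy the rule...
print(review_selection(list(METRIC_TO_DIMENSION)))
# ...but nothing here touches S or C, which is why a team might add, say, a satisfaction
# survey or a collaboration measure of its own alongside them.
```

That last point is the practical takeaway: DORA gives you ready-made metrics for three of the five dimensions, and SPACE nudges you to decide deliberately whether and how to cover the other two.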
Coming back to that developer/SRE split: the worst possible situation would be turning to developers and saying, “These 2 throughput metrics, they're yours. Make sure they go up no matter what,” and then turning to our SREs and saying, “Those stability metrics, they're yours. Make sure they stay good.” All that does is put false incentives in place, and we're just fighting against each other.
We talked a little more about the future of DORA in our podcast episode (player/link right at the top of this post) if you want to hear about that.
Here are some useful links from Nathen for further research:
DORA online community of practice
DORA homepage
[Article] The SPACE of Developer Productivity
Nathen Harvey's Linktree
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit read.srepath.com
Last weekend I was at Bosworth for their annual medieval festival and battle reenactment, where I met up with historian and author Nathen Amin to talk about his latest book 'Son of Prophecy: The Rise of Henry Tudor'. You can also watch this on YouTube and listen to it on the podcast. Join Patreon for an extra bonus 11 minutes with Nathen. Get full access to British History at philippab.substack.com/subscribe
Cherie Hoeger, CEO and co-founder of Saalt, shares her incredible mission to address period poverty worldwide by building a purpose-driven business.
In this special Bonus episode, Charlie Higson welcomes back a guest from earlier in the series, Nathen Amin, a proud Welshman and self-proclaimed Henry VII super-fan. Charlie chats with Nathen about his new book Son Of Prophecy: The Rise Of Henry Tudor, temporarily shining the spotlight away from Henry's notorious son, Henry VIII. Hosted on Acast. See acast.com/privacy for more information.
In this episode, Jackson sits down to talk to author and historian Nathen Amin to discuss the Welsh roots of the Tudors, and to reinsert the Welshness of Henry VII back into the historical narrative of the Tudors, which Nathen brings to us in his brand new book 'Son of Prophecy: The Rise of Henry Tudor'.
Grab a copy of Son of Prophecy here.
Keep up to date with Nathen via his X, Instagram, website and Newsletter.
If you want to get in touch with History with Jackson, email: jackson@historywithjackson.co.uk
Please support us on our Patreon!
To catch up on everything to do with History with Jackson head to www.HistorywithJackson.co.uk
Follow us on Facebook at @HistorywithJackson
Follow us on Instagram at @HistorywithJackson
Follow us on X/Twitter at @HistorywJackson
Follow us on TikTok at @HistorywithJackson
Get bonus content on Patreon.
Hosted on Acast. See acast.com/privacy for more information.
And: our local 7-11 experiences a devastating coup d'état
Nathen Harvey is Developer Advocate and the lead for DORA at Google Cloud, their DevOps Research and Assessment unit. For ten years Nathen has spearheaded tech communities and authored several reports that now form the industry standard for measuring DevOps performance. He was once a CRM system administrator too, so he knows his stuff and the challenges we all face!
Nathen joins Jack on the DevOps Diaries podcast to discuss all things metrics. In an enticing and insightful conversation, Nathen shares how the DORA metrics came to be, what they are, and why measuring performance is important for all teams to drive their businesses in the right direction.
Nathen also shares some of the common pitfalls of metrics, including Goodhart's Law, and what we can do to avoid them. If you're unsure of where to start when it comes to measuring performance, or how to improve, this is the episode for you.
Nathen and Jack also discuss the wider market trends and how engineering teams can up their game using the SPACE framework.
Learn more:
DORA 2023 report
The State of Salesforce DevOps 2024 report
The common pitfalls when measuring performance
How to align metrics with business performance
Connect with Nathen: LinkedIn, X/Twitter
Connect with Jack: X/Twitter, LinkedIn
Podcast produced and sponsored by Gearset, the complete Salesforce DevOps platform. Try Gearset free for 30 days.
And, for tax purposes, we read The Big Bang Theory's "Big Book of Lists"
Michael Ungaro, CEO and business owner at San Pedro Fish Market, shares how he transformed the family business with the help of EOS Implementer® Nathen Fox.
What do our vulnerable child stars need? Two words: Nathen Mazri.
What a Phenomenal Episode! You get to see the Bromance Between Coach Watson & Coach McPeek
Happy St. David's Day! As our guest today, we have Nathen Amin, and we quiz him on everything to do with Henry VII, starting with his origin story: part Tudor, part Beaufort, part Valois, this famous son of Wales has a fascinating story.
Please find Nathen's books here: https://www.amberley-books.com/author-community-main-page/a/nathen-amin.html
For more history fodder please visit https://www.ifitaintbaroque.art/ and https://www.reignoflondon.com/
If you would like to join Natalie on one of her Tudor monarchs walking tours in London, please follow the link:
https://www.getyourguide.com/london-l57/london-the-royal-british-kings-and-queens-walking-tour-t426011/
Hosted on Acast. See acast.com/privacy for more information.
Transcript: bit.ly/AIAe054-1
VOLORES' debut album, AGES, explores universal themes of life, love, and death through dark indie rock, stylish post-punk motifs, and disarmingly frank lyricism. The Colorado-based couple – Flogging Molly bassist Nathen Maxwell and his singer-songwriter wife Shelby – lightheartedly dub their singular sound “mountain goth.”
Organic, haunting, and relentlessly authentic, VOLORES' broad appeal lies in its raw channeling of the mortal condition, including mental health struggles, that they've not only experienced, but experienced together. Simple, yet effortlessly beautiful, AGES celebrates the shared musical passions that brought the Maxwells together – from Leonard Cohen and Elliott Smith to The Cure and Interpol – through unfiltered expressions that cast deeply personal shadows in plain sight, coated only in intuitive melody and elegant songcraft.
VOLORES (voloresband.com)
In this episode of the "Giant Robots Smashing Into Other Giant Robots" podcast, host Victoria Guido delves into the intersection of technology, product development, and personal passions with her guests Henry Yin, Co-Founder and CTO of Merico, and Maxim Wheatley, the company's first employee and Community Leader. They are joined by Joe Ferris, CTO of thoughtbot, as a special guest co-host. The conversation begins with a casual exchange about rock climbing, revealing that both Henry and Victoria share this hobby, which provides a unique perspective on their professional roles in software development. Throughout the podcast, Henry and Maxim discuss the journey and evolution of Merico, a company specializing in data-driven tools for developers. They explore the early stages of Merico, highlighting the challenges and surprises encountered while seeking product-market fit and the strategic pivot from focusing on open-source funding allocation to developing a comprehensive engineering metric platform. This shift in focus led to the creation of Apache DevLake, an open-source project contributed to by Merico and later donated to the Apache Software Foundation, reflecting the company's commitment to transparency and community-driven development. The episode also touches on future challenges and opportunities in the field of software engineering, particularly the integration of AI and machine learning tools in the development process. Henry and Maxim emphasize the potential of AI to enhance developer productivity and the importance of data-driven insights in improving team collaboration and software delivery performance. Joe contributes to the discussion with his own experiences and perspectives, particularly on the importance of process over individual metrics in team management. Merico (https://www.merico.dev/) Follow Merico on GitHub (https://github.com/merico-dev), Linkedin (https://www.linkedin.com/company/merico-dev/), or X (https://twitter.com/MericoDev). Apache DevLake (https://devlake.apache.org/) Follow Henry Yin on LinkedIn (https://www.linkedin.com/in/henry-hezheng-yin-88116a52/). Follow Maxim Wheatley on LinkedIn (https://www.linkedin.com/in/maximwheatley/) or X (https://twitter.com/MaximWheatley). Follow thoughtbot on X (https://twitter.com/thoughtbot) or LinkedIn (https://www.linkedin.com/company/150727/). Become a Sponsor (https://thoughtbot.com/sponsorship) of Giant Robots! Transcript: VICTORIA: This is the Giant Robots Smashing Into Other Giant Robots podcast, where we explore the design, development, and business of great products. I'm your host, Victoria Guido. And with me today is Henry Yin, Co-Founder and CTO of Merico, and Maxim Wheatley, the first employee and Community Leader of Merico, creating data-driven developer tools for forward-thinking devs. Thank you for joining us. HENRY: Thanks for having us. MAXIM: Glad to be here, Victoria. Thank you. VICTORIA: And we also have a special guest co-host today, the CTO of thoughtbot, Joe Ferris. JOE: Hello. VICTORIA: Okay. All right. So, I met Henry and Maxim at the 7CTOs Conference in San Diego back in November. And I understand that Henry, you are also an avid rock climber. HENRY: Yes. I know you were also in Vegas during Thanksgiving. And I sort of have [inaudible 00:49] of a tradition to go to Vegas every Thanksgiving to Red Rock National Park. Yeah, I'd love to know more about how was your trip to Vegas this Thanksgiving. VICTORIA: Yes. I got to go to Vegas as well. We had a bit of rain, actually. 
So, we try not to climb on sandstone after the rain and ended up doing some sport climbing on limestone around the Blue Diamond Valley area; a little bit light on climbing for me, actually, but still beautiful out there. I loved being in Red Rock Canyon outside of Las Vegas. And I do find that there's just a lot of developers and engineers who have an affinity for climbing. I'm not sure what exactly that connection is. But I know, Joe, you also have a little bit of climbing and mountaineering experience, right? JOE: Yeah. I used to climb a good deal. I actually went climbing for the first time in, like, three years this past weekend, and it was truly pathetic. But you have to [laughs] start somewhere. VICTORIA: That's right. And, Henry, how long have you been climbing for? HENRY: For about five years. I like to spend my time in nature when I'm not working: hiking, climbing, skiing, scuba diving, all of the good outdoor activities. VICTORIA: That's great. And I understand you were bouldering in Vegas, right? Did you go to Kraft Boulders? HENRY: Yeah, we went to Kraft also Red Spring. It was a surprise for me. I was able to upgrade my outdoor bouldering grade to B7 this year at Red Spring and Monkey Wrench. There was always some surprises for me. When I went to Red Rock National Park last year, I met Alex Honnold there who was shooting a documentary, and he was really, really friendly. So, really enjoying every Thanksgiving trip to Vegas. VICTORIA: That's awesome. Yeah, well, congratulations on B7. That's great. It's always good to get a new grade. And I'm kind of in the same boat with Joe, where I'm just constantly restarting my climbing career. So [laughs], I haven't had a chance to push a grade like that in a little while. But that sounds like a lot of fun. HENRY: Yeah, it's really hard to be consistent on climbing when you have, like, a full-time job, and then there's so much going on in life. It's always a challenge. VICTORIA: Yeah. But a great way to like, connect with other people, and make friends, and spend time outdoors. So, I still really appreciate it, even if I'm not maybe progressing as much as I could be. That's wonderful. So, tell me, how did you and Maxim actually meet? Did you meet through climbing or the outdoors? MAXIM: We actually met through AngelList, which I really recommend to anyone who's really looking to get into startups. When Henry and I met, Merico was essentially just starting. I had this eagerness to explore something really early stage where I'd get to do all of the interesting kind of cross-functional things that come with that territory, touching on product and marketing, on fundraising, kind of being a bit of everything. And I was eager to look into something that was applying, you know, machine learning, data analytics in some really practical way. And I came across what Hezheng Henry and the team were doing in terms of just extracting useful insights from codebases. And we ended up connecting really well. And I think the previous experience I had was a good fit for the team, and the rest was history. And we've had a great time building together for the last five years. VICTORIA: Yeah. And tell me a little bit more about your background and what you've been bringing to the Merico team. MAXIM: I think, like a lot of people in startups, consider myself a member of the Island of Misfit Toys in the sense that no kind of clear-cut linear pathway through my journey but a really exciting and productive one nonetheless. 
So, I began studying neuroscience at Georgetown University in Washington, D.C. I was about to go to medical school and, in my high school years had explored entrepreneurship in a really basic way. I think, like many people do, finding ways to monetize my hobbies and really kind of getting infected with that bug that I could create something, make money from it, and kind of be the master of my own destiny, for lack of less cliché terms. So, not long after graduating, I started my first job that recruited me into a seed-stage venture capital, and from there, I had the opportunity to help early-stage startups, invest in them. I was managing a startup accelerator out there. From there, produced a documentary that followed those startups. Not long after all of that, I ended up co-founding a consumer electronics company where I was leading product, so doing lots of mechanical, electrical, and a bit of software engineering. And without taking too long, those were certainly kind of two of the more formative things. But one way or another, I've spent my whole career now in startups and, especially early-stage ones. It was something I was eager to do was kind of take some of the high-level abstract science that I had learned in my undergraduate and kind of apply some of those frameworks to some of the things that I do today. VICTORIA: That's super interesting. And now I'm curious about you, Henry, and your background. And what led you to get the idea for Merico? HENRY: Yeah. My professional career is actually much simpler because Merico was my first company and my first job. Before Merico, I was a PhD student at UC Berkeley studying computer science. My research was an intersection of software engineering and machine learning. And back then, we were tackling this research problem of how do we fairly measure the developer contributions in a software project? And the reason we are interested in this project has to do with the open-source funding problem. So, let's say an open-source project gets 100k donations from Google. How does the maintainers can automatically distribute all of the donations to sometimes hundreds or thousands of contributors according to their varying level of contributions? So, that was the problem we were interested in. We did research on this for about a year. We published a paper. And later on, you know, we started the company with my, you know, co-authors. And that's how the story began for Merico. VICTORIA: I really love that. And maybe you could tell me just a little bit more about what Merico is and why a company may be interested in trying out your services. HENRY: The product we're currently offering actually is a little bit different from what we set out to build. At the very beginning, we were building this platform for open-source funding problem that we can give an open-source project. We can automatically, using algorithm, measure developer contributions and automatically distribute donations to all developers. But then we encountered some technical and business challenges. So, we took out the metrics component from the previous idea and launched this new product in the engineering metric space. And this time, we focus on helping engineering leaders better understand the health of their engineering work. So, this is the Merico analytics platform that we're currently offering to software engineering teams. JOE: It's interesting. I've seen some products that try to judge the health of a codebase, but it sounds like this is more trying to judge the health of the team. 
MAXIM: Yeah, I think that's generally fair to say. As we've evolved, we've certainly liked to describe ourselves as, you know, I think a lot of people are familiar with observability tools, which help ultimately ascertain, like, the performance of the technology, right? Like, it's assessing, visualizing, chopping up the machine-generated data. And we thought there would be a tremendous amount of value in being, essentially, observability for the human-generated data. And I think, ultimately, what we found on our journey is that there's a tremendous amount of frustration, especially in larger teams, not in looking to use a tool like that for any kind of, like, policing type thing, right? Like, no one's looking if they're doing it right, at least looking to figure out, like, oh, who's underperforming, or who do we need to yell at? But really trying to figure out, like, where are the strengths? Like, how can we improve our processes? How can we make sure we're delivering better software more reliably, more sustainably? Like how are we balancing that trade-off between new features, upgrades and managing tech debt and bugs? We've ultimately just worked tirelessly to, hopefully, fill in those blind spots for people. And so far, I'm pleased to say that the reception has been really positive. We've, I think, tapped into a somewhat subtle but nonetheless really important pain point for a lot of teams around the world. VICTORIA: Yeah. And, Henry, you said that you started it based on some of the research that you did at UC Berkeley. I also understand you leaned on the research from the DevOps research from DORA. Can you tell me a little bit more about that and what you found insightful from the research that was out there and already existed? MAXIM: So, I think what's really funny, and it really speaks to, I think, the importance in product development of just getting out there and speaking with your potential users or actual users, and despite all of the deep, deep research we had done on the topic of understanding engineering, we really hadn't touched on DORA too much. And this is probably going back about five years now. Henry and I were taking a customer meeting with an engineering leader at Yahoo out in the Bay Area. He kind of revealed this to us basically where he's like, "Oh, you guys should really look at incorporating DORA into this thing. Like, all of the metrics, all of the analytics you're building super cool, super interesting, but DORA really has this great framework, and you guys should look into it." And in hindsight, I think we can now [chuckles], honestly, admit to ourselves, even if it maybe was a bit embarrassing at the time where both Henry and I were like, "What? What is that? Like, what's Dora?" And we ended up looking into it and since then, have really become evangelists for the framework. And I'll pass it to Henry to talk about, like, what that journey has looked like. HENRY: Thanks, Maxim. I think what's cool about DORA is in terms of using metrics, there's always this challenge called Goodhart's Law, right? So, whenever a metric becomes a target, the metric cease to be a good metric because people are going to find ways to game the metric. So, I think what's cool about DORA is that it actually offers not just one metric but four key metrics that bring balance to covering both the stability and velocity. So, when you look at DORA metrics, you can't just optimize for velocity and sacrificing your stability. 
But you have to look at all four metrics at the same time, and that's harder to game. So, I think that's why it's become more and more popular in the industry as the starting point for using metrics for data-driven engineering. VICTORIA: Yeah. And I like how DORA also represents it as the metrics and how they apply to where you are in the lifecycle of your product. So, I'm curious: with Merico, what kind of insights do you think engineering leaders can gain from having this data that will unlock some of their team's potential? MAXIM: So, I think one of the most foundational things before we get into any detailed metrics is I think it's more important than ever, especially given that so many of us are remote, right? Where the general processes of software engineering are generally difficult to understand, right? They're nuanced. They tend to kind of happen in relative isolation until a PR is reviewed and merged. And it can be challenging, of course, to understand what's being done, how consistently, how well, like, where are the good parts, where are the bad parts. And I think that problem gets really exasperated, especially in a remote setting where no one is necessarily in the same place. So, on a foundational level, I think we've really worked hard to solve that challenge, where just being able to see, like, how are we doing? And to that point, I think what we've found before anyone even dives too deep into all of the insights that we can deliver, I think there's a tremendous amount of appetite for anyone who's looking to get into that practice of constant improvement and figuring out how to level up the work they're doing, just setting close benchmarks, figuring out, like, okay, when we talk about more nebulous or maybe subjective terms like speed, or quality, what does good look like? What does consistent look like? Being able to just tie those things to something that really kind of unifies the vocabulary is something I always like to say, where, okay, now, even if we're not focused on a specific metric, or we don't have a really particular goal in mind that we want to assess, now we're at least starting the conversation as a team from a place where when we talk about quality, we have something that's shared between us. We understand what we're referring to. And when we're talking about speed, we can also have something consistent to talk about there. And within all of that, I think one of the most powerful things is it helps to really kind of ground the conversations around the trade-offs, right? There's always that common saying: the triangle of trade-offs is where it's, like, you can have it cheap; you can have it fast, and you can have it good, but you can only have two. And I think with DORA, with all of these different frameworks with many metrics, it helps to really solidify what those trade-offs look like. And that's, for me at least, been one of the most impactful things to watch: is our global users have really started evolving their practices with it. HENRY: Yeah. And I want to add to Maxim's answer. But before that, I just want to quickly mention how our products are structured. So, Merico actually has an open-source component and a proprietary component. So, the open-source component is called Apache DevLake. It's an open-source project we created first within Merico and later on donated to Apache Software Foundation. And now, it's one of the most popular engineering metrics tool out there. 
And then, on top of that, we built a SaaS offering called DevInsight Cloud, which is powered by Apache DevLake. So, with DevLake, the open-source project, you can set up your data connections, connect DevLake to all of the dev tools you're using, and then we collect data. And then we provide many different flavors of dashboards for our users. And many of those dashboards are structured, and there are different questions engineering teams might want to ask. For example, like, how fast are we responding to our customer requirement? For that question, we will look at like, metrics like change lead time, or, like, for a question, how accurate is our planning for the sprint? In that case, the dashboard will show metrics relating to the percentage of issues we can deliver for every sprint for our plan. So, that's sort of, you know, based on the questions that the team wants to answer, we provide different dashboards that help them extract insights using the data from their DevOps tools. JOE: It's really interesting you donated it to Apache. And I feel like the hybrid SaaS open-source model is really common. And I've become more and more skeptical of it over the years as companies start out open source, and then once they start getting competitors, they change the license. But by donating it to Apache, you sort of sidestep that potential trust issue. MAXIM: Yeah, you've hit the nail on the head with that one because, in many ways, for us, engaging with Apache in the way that we have was, I think, ultimately born out of the observations we had about the shortcomings of other products in the space where, for one, very practical. We realized quickly that if we wanted to offer the most complete visibility possible, it would require connections to so many different products, right? I think anyone can look at their engineering toolchain and identify perhaps 7, 9, 10 different things they're using on a day-to-day basis. Oftentimes, those aren't shared between companies, too. So, I think part one was just figuring out like, okay, how do we build a framework that makes it easy for developers to build a plugin and contribute to the project if there's something they want to incorporate that isn't already supported? And I think that was kind of part one. Part two is, I think, much more important and far more profound, which is developer trust, right? Where we saw so many different products out there that claimed to deliver these insights but really had this kind of black-box approach, right? Where data goes in, something happens, insights come out. How's it doing that? How's it weighting things? What's it calculating? What variables are incorporated? All of that is a mystery. And that really leads to developers, rightfully, not having a basis to trust what's actually being shown to them. So, for us, it was this perspective of what's the maximum amount of transparency that we could possibly offer? Well, open source is probably the best answer to that question. We made sure the entirety of the codebase is something they can take a look at, they can modify. They can dive into the underlying queries and algorithms and how everything is working to gain a total sense of trust in how is this thing working? And if I need to modify something to account for some nuanced details of how our team works, we can also do that. 
And to your point, you know, I think it's definitely something I would agree with that one of the worst things we see in the open-source community is that companies will be kind of open source in name only, right? Where it's really more of marketing or kind of sales thing than anything, where it's like, oh, let's tap into the good faith of open source. But really, somehow or another, through bait and switch, through partial open source, through license changes, whatever it is, we're open source in name only but really, a proprietary, closed-source product. So, for us, donating the core of DevLake to the Apache Foundation was essentially our way of really, like, putting, you know, walking the talk, right? Where no one can doubt at this point, like, oh, is this thing suddenly going to have the license changed? Is this suddenly going to go closed-source? Like, the answer to that now is a definitive no because it is now part of that ecosystem. And I think with the aspirations we've had to build something that is not just a tool but, hopefully, long-term becomes, like, foundational technology, I think that gives people confidence and faith that this is something they can really invest in. They can really plumb into their processes in a deep and meaningful way with no concerns whatsoever that something is suddenly going to change that makes all of that work, you know, something that they didn't expect. JOE: I think a lot of companies guard their source code like it's their secret sauce, but my experience has been more that it's the secret shame [laughs]. HENRY: [laughs] MAXIM: There's no doubt in my role with, especially our open-source product driving our community we've really seen the magic of what a community-driven product can be. And open source, I think, is the most kind of a true expression of a community-driven product, where we have a Slack community with nearly 1,000 developers in it now. Naturally, right? Some of those developers are in there just to ask questions and answer questions. Some are intensely involved, right? They're suggesting improvements. They're suggesting new features. They're finding ways to refine things. And it really is that, like, fantastic culture that I'm really proud that we've cultivated where best idea ships, right? If you've got a good idea, throw it into a GitHub issue or a comment. Let's see how the community responds to it. Let's see if someone wants to pick it up. Let's see if someone wants to submit a PR. If it's good, it goes into production, and then the entire community benefits. And, for me, that's something I've found endlessly exciting. HENRY: Yeah. I think Joe made a really good point on the secret sauce part because I don't think the source code is our secret sauce. There's no rocket science in DevLake. If we break it down, it's really just some UI UX plus data pipelines. I think what's making DevLake successful is really the trust and collaboration that we're building with the open-source community. When it comes to trust, I think there are two aspects. First of all, trust on the metric accuracy, right? Because with a lot of proprietary software, you don't know how they are calculating the metrics. If people don't know how the metrics are calculated, they can't really trust it and use it. And secondly, is the trust that they can always use this software, and there's no vendor lock-in. 
And when it comes to collaboration, we were seeing many of our data sources and dashboards being contributed not by our core developers but by the community. And the community really, you know, brings their insights and their use cases into DevLake and makes DevLake, you know, more successful and more applicable to more teams in different areas of software engineering. MID-ROLL AD: Are you an entrepreneur or start-up founder looking to gain confidence in the way forward for your idea? At thoughtbot, we know you're tight on time and investment, which is why we've created targeted 1-hour remote workshops to help you develop a concrete plan for your product's next steps. Over four interactive sessions, we work with you on research, product design sprint, critical path, and presentation prep so that you and your team are better equipped with the skills and knowledge for success. Find out how we can help you move the needle at tbot.io/entrepreneurs. VICTORIA: I understand you've taken some innovative approaches on using AI in your open-source repositories to respond to issues and questions from your developers. So, can you tell me a little bit more about that? HENRY: Absolutely. I self-identify as a builder. And one characteristic of a builder is to always chase after the dream of building infinite things within the finite lifespan. So, I was always thinking about how we can be more productive, how we can, you know, get better at getting better. And so, this year, you know, AI is huge, and there are so many AI-powered tools that can help us achieve more in terms of delivering software. And then, internally, we had a hackathon, and there's one project that came out of it, an AI-powered coding assistant called DevChat. And we have made it public at devchat.ai. But we've been closely following, you know, what are the other AI-powered tools that can make, you know, software developers' or open-source maintainers' lives easier? And we've been observing that there are more and more open-source projects adopting AI chatbots to help them handle, you know, respond to GitHub issues. So, I recently did a case study on a pretty popular open-source project called LangChain. It's the hot kid in the AI space right now. And it's using a chatbot called Dosu to help respond to issues. I had some interesting findings from the case study. VICTORIA: In what ways was that chatbot really helpful, and in what ways did it not really work that well? HENRY: Yeah, I was thinking of how to measure the effectiveness of that chatbot. And I realized that there is a feature that's built into GitHub, which is the reaction to a comment. So, how the chatbot works is whenever there is a new issue, the chatbot would basically run a retrieval-augmented generation pipeline and then use an LLM to generate a response to the issue. And then people leave reactions to that comment by the chatbot, but mostly, it's thumbs up and thumbs down. So, what I did is I collected all of the issues from the LangChain repository and looked at how many thumbs up and thumbs down the Dosu chatbot got, you know, from all of the comments it left on those issues. So, what I found is that across 2,600 issues that the Dosu chatbot helped with, it got around 900 thumbs up and 1,300 thumbs down. So, then it comes to how do we interpret this data, right? Because getting more thumbs down than thumbs up doesn't mean that it's actually not useful, or that it's harmful to the developers.
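For readers who want to try a tally like the one Henry describes, here is a rough sketch against the GitHub REST API. It is an illustration of the method, not Merico's actual script: the repository and bot login are placeholders, and it leans on the reactions rollup field that the API returns on issue-comment objects.

```python
import os
import requests

# Placeholders for illustration -- adjust to the repository and bot you want to study.
REPO = "langchain-ai/langchain"   # assumption: the LangChain repository discussed above
BOT_LOGIN = "dosubot"             # assumption: the chatbot's GitHub login
TOKEN = os.environ["GITHUB_TOKEN"]

headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/vnd.github+json"}

up = down = bot_comments = 0
page = 1
while True:
    # List issue comments for the whole repository, 100 at a time.
    resp = requests.get(
        f"https://api.github.com/repos/{REPO}/issues/comments",
        headers=headers,
        params={"per_page": 100, "page": page},
    )
    resp.raise_for_status()
    comments = resp.json()
    if not comments:
        break
    for c in comments:
        if c.get("user", {}).get("login") != BOT_LOGIN:
            continue  # only count the bot's own comments
        bot_comments += 1
        reactions = c.get("reactions", {})  # rollup with "+1" / "-1" counts
        up += reactions.get("+1", 0)
        down += reactions.get("-1", 0)
    page += 1

print(f"Bot comments: {bot_comments}, thumbs up: {up}, thumbs down: {down}")
```

A real run would respect rate limits and paginate via the Link header, but the core of the measurement is just two counters over the bot's comments.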
So, to answer that question, I actually looked at some examples of thumbs-up and thumbs-down comments. And what I found is the thumbs down doesn't mean that the chatbot is harmful. Mostly, it's the developers signaling to the open-source maintainers that your chatbot is not helping in this case, and we need human intervention. But with the thumbs up, the chatbot is actually helping a lot. There's one issue where someone posted a question, and the chatbot just wrote the code and then basically made a suggestion on how to resolve the issue. And the human response is, "Damn, it worked." And that was very surprising to me, and it made me consider, you know, adopting similar technology and AI-powered tools for our own open-source project. VICTORIA: That's very cool. Well, I want to go back to the beginning of Merico. And when you first got started, and you were trying to understand your customers and what they need, was there anything surprising in that early discovery process that made you change your strategy? HENRY: So, one challenge we faced when we first explored the open-source funding allocation problem space is that our algorithm looks at the Git repository. But with software engineering, especially with open-source collaboration, there are so many activities that are happening outside of open-source repos on GitHub. For example, I might be an evangelist, and my day-to-day work might be, you know, engaging in community work, talking about the open-source project at conferences. And all of those things were not captured by our algorithm, which was only looking at the GitHub repository at the time. So, that was one of the technical challenges that we faced, and it led us to switch over to more of the system-driven metrics side. VICTORIA: Gotcha. Over the years, how has Merico grown? What has changed between when you first started and today? HENRY: So, one thing is the team size. When we just got started, we only had, you know, the three co-founders and Maxim. And now we have grown to a team of 70 team members, and we have a fully distributed team across multiple continents. So, those are pretty interesting dynamics to handle. And we learned a lot about how to build an effective, cohesive team along the way. And in terms of product, DevLake now, you know, has more than 900 developers in our Slack community, and we track over 360 companies using DevLake. So, we've definitely come a long way since we started the journey. And yeah, tomorrow we...actually, Maxim and I are going to host our end-of-year Apache DevLake Community Meetup, featuring Nathen Harvey, Google's DORA team lead. Yeah, we've definitely made some progress in the four years we've been working on Merico. VICTORIA: Well, that's exciting. Well, say hi to Nathen for me. I helped take over DevOps DC, which he was running way back in the day, with some of the other organizers, so [laughs] that's great. What challenges do you see on the horizon for Merico and DevLake? MAXIM: One of the challenges I think about a lot, and I think it's front of mind for many people, especially in software engineering but, at this point, in nearly every profession, is what does AI mean for everything we're doing? What does the future look like where developers are maybe producing the majority of their code through prompt-based approaches versus code-based approaches, right? How do we start thinking about how we coherently assess that? 
Like, how do you maybe redefine what the value is when there's a scenario where perhaps all coders, you know, if we maybe fast forward a few years, like, what if the AI is so good that the code is essentially perfect? What does success look like then? How do you start thinking about what is a good team if everyone is shooting out 9 out of 10 PRs nearly every time because they're all using a unified framework supported by AI? So, I think that's certainly kind of one of the challenges I envision in the future. I think, really practically, too, many startups have been contending with the macroclimate, particularly the fundraising climate. You know, I think many of the companies out there, us included, had better conditions in 2019, 2020 to raise funds at more favorable valuations, perhaps more relaxed terms, given the climate of the public markets and, you know, monetary policy. I think that's something, obviously, we're all experiencing, and it has tightened things up: revenue expectations, or now higher expectations on getting to a highly profitable place, you know, the benchmark is set a lot higher there. So, I think it's not a challenge that's unique to us in any way at all. I think it's true for almost every company that's out there. It's now kind of thinking in a more disciplined way about how do you meet the market demands without compromising on the product vision and without compromising on the roadmap and the strategies that you've put in place that are working but are maybe coming under a little bit more pressure, given kind of the new set of rules that have been laid out for all of us? VICTORIA: Yeah, that is going to be a challenge. And do you see the company and the product solving some of those challenges in a unique way? HENRY: I've been thinking about how AI can fulfill the promise of making developers 10x developers. I'm an early adopter and big fan of GitHub Copilot. I think it really helps with writing, like, the boilerplate code. But I think it's improving my productivity by maybe 20% to 30%. It's still pretty far away from 10x. So, I'm thinking about how Merico's solutions can help fill the gap a little bit. In terms of Apache DevLake and its SaaS offering, I think we are helping with, like, team collaboration and measuring, like, software delivery performance and how the team can improve as a whole. And then, recently, we had a spin-off, which is the AI-powered coding assistant DevChat. And that's sort of more about empowering individual developers with, like, testing, refactoring, these common workflows. And one big thing for us in the future is how we can combine these two components, you know, the team collaboration and improvement tool, DevLake, with the individual coding assistant, DevChat, how they can be integrated together to empower developers. I think that's the big question for Merico ahead. JOE: Have you used Merico to judge the contributions of AI to a project? HENRY: [laughs] So, actually, after we pivoted to engineering metrics, we now focus less on individual contribution because that sometimes can be counterproductive. Because whenever you visualize that, people will sometimes become defensive and try to optimize for the metrics that measure individual contributions. So, we sort of...nowadays, we no longer offer those kinds of metrics within DevLake, if that makes sense. MAXIM: And that kind of goes back to one of Victoria's earlier questions about, like, what surprised us in the journey. 
Early on, we had this very benevolent perspective, you know, I would want to kind of underline that, that we never sought to be judging individuals in a negative way. We were looking to find ways to make it useful, even to a point of finding ways...like, we explored different ways to give developers badges and different kinds of accomplishment milestones, like, things to kind of signal their strengths and accomplishments. But I think what we've found in that journey is that...and I would really kind of say this strongly. I think the only way that metrics of any kind serve an organization is when they support a healthy culture. And to that end, what we found is that we always like to preach, like, it's processes, not people. It's figuring out if you're hiring correctly, if you're making smart decisions about who's on the team. I think you have to operate with a default assumption, within reason, that those people are doing their best work. They're trying to move the company forward. They're trying to make good decisions to better serve the customers, better serve the company and the product. With that in mind, what you're really looking to do is figure out what is happening within the underlying processes that get something from thought to production. And how do you clear the way for people? And I think that's really been a big, you know, almost tectonic shift for our company over the years, really fully transitioning to that. And I think, in some ways, DORA has represented almost, like, a best practice for, like, processes over people, right? It's figuring out, between quality and speed, how are you doing? Where are those trade-offs? And then, within the processes that account for those outcomes, how can you really be improving things? So, I would say, for us, that's, like, been kind of the number one thing there: figuring out, like, how do we keep doubling down on processes, not people? And how do we really make sure that we're not just telling people that we're on their side and we're taking a, you know, a very humanistic perspective on wanting to improve the lives of people but actually doing it with the product? HENRY: But putting the challenge of measuring individual contributions aside, I'm as curious as Joe about AI's role in software engineering. I expect to see more and more involvement of AI, gradually, you know, replacing low-level and medium-level and, in the future, even high-level tasks for humans so we can just focus on, like, the objective instead of the implementation. VICTORIA: I can imagine, especially if you're starting to integrate AI tools into your systems and if you're growing your company at scale, some of the ability to have a natural intuition about what's going on really becomes a challenge, and the data that you can derive from some of these products could help you make better decisions and all different types of things. So, I'm kind of curious to hear from Joe; with your history of open-source contribution and being a part of many different development teams, what kind of information do you wish that you had to help you make decisions in your role? JOE: Yeah, that's an interesting question. I've used some tools that try to identify problem spots in the code. But it'd be interesting to see the results of tools that analyze problem spots in the process. Like, I'd like to learn more about how that works. HENRY: I'm curious; one question for Joe. 
What is your favorite non-AI-powered code scanning tool that you find useful for yourself or for your team? JOE: I think the most common static analysis tool I use is something to find the Git churn in a repository. Some of this is probably because these days I've worked mostly on projects with dynamic languages. So, there's kind of a limit to how much static analysis you can do of, you know, a Ruby or a Python codebase. But just analyzing which parts of the application changed the most helps you find which parts are likely to be the buggiest and the most complex. I think every application tends to involve some central model. Like, if you're making an e-commerce site, then probably products are going to have a lot of the core logic, and purchases will have a lot of the core logic. And identifying those centers of gravity just through the Git statistics has helped me find places that need to be reworked. HENRY: That's really interesting. Is it something like a hotspot analysis? And when you find a hotspot, would you then invest more resources in, like, refactoring the hotspot to make it more maintainable? JOE: Right, exactly. Like, you can use the statistics to see which files you should look at. And then, usually, when you actually go into the files, especially if you look at some of the changes to the files, it's pretty clear that, you know, for example, a class has become too large, or something has become too tightly coupled. HENRY: Gotcha. VICTORIA: Yeah. And so, if you could go back in time five years and give yourself some advice when you first started along this journey, what advice would you give yourself? MAXIM: I'll answer the question in two ways: first for the company and then for myself personally. I think for the company, what I would say is, especially when you're in that kind of pre-product-market-fit space, and you're maybe struggling to figure out how to solve a challenge that really matters, I think you need to really think carefully about, like, how would you yourself be using your product? And if you're finding reasons you wouldn't, like, really, really pay careful attention to those. And I think, for us, like, early on in our journey, we ultimately kind of found ourselves asking, we're like, okay, we're a smaller, earlier-stage team. Perhaps, like, small improvements in productivity or quality aren't going to necessarily move the needle. That's one of the reasons maybe we're not using this. Maybe our developers are already at bandwidth. So, it's not a question of unlocking more bandwidth or figuring out where there are kind of weak points or bottlenecks at that level, but maybe how can we dial in our own processes to let the whole team function more effectively. And I think, for us, like, the more we started thinking through that lens of, like, what's useful to us, like, what's solving a pain point for us, I think, in many ways, DevLake was born out of that exact thinking. And now DevLake is used by hundreds of companies around the world and has, you know, this nearly thousand-developer community that supports it. And I think that's a testament to the power of that. For me, personally, if I were to kind of go back five years, you know, I'm grateful to say there isn't a whole lot I would necessarily change. But I think if there's anything that I would, it would just be to consistently be braver in sharing ideas, right? 
I think Merico has done a great job, and it's something I'm so proud of for us as a team, really embracing new ideas and really kind of making sure, like, the best idea ships, right? There isn't a title. There isn't a level of seniority that determines whether or not someone has a right to suggest something or improve something. And I think with that in mind, for me as a technical person but not a member of technical staff, so to speak, I think there were many occasions, for me personally, where I felt like, okay, maybe because of that, I shouldn't necessarily weigh in on certain things. And I think what I've found, and it's a trust-building thing as well, is, like, even if you're wrong, even if your suggestion maybe misunderstands something or isn't quite on target, there's still a tremendous amount of value in just being able to share a perspective and share a recommendation and push it out there. And I think with that in mind, like, it's something I would encourage myself and encourage everybody else in a healthy company to feel comfortable to just keep sharing because, ultimately, it's an accuracy-by-volume game to a certain degree, right? Where if I come up with one idea, then I've got one swing at the bat. But if we as a collective come up with 100 ideas that we consider intelligently, we've got a much higher chance of maybe a handful of those really pushing us forward. So, for me, that would be advice I would give myself and to anybody else. HENRY: I'll follow the same structure, so I'll start with the advice at the company level and then advice to myself as an individual. So, at the company level, I think my advice would be fail fast because every company needs to go through this exploration phase trying to find their product-market fit, and they will have to test, you know, a couple of ideas before they find the right fit for themselves; the same was true for us. And I wish that we had actually had more structure in exploring those ideas and had set deadlines, you know, set milestones for us to quickly test and filter out bad ideas and then accelerate the exploration process. So, fail fast would be my suggestion at the company level. At an individual level, I would say it's more about adapting to my CTO role because when I started the company, I still had that, you know, graduate-student hustle mindset. I love writing code myself. And it was okay if I spent 100% of my time writing code when the company was, you know, at five people, right? But it's not okay [chuckles] when we have, you know, a team of 40 engineers. So, I wish I had had that realization earlier and had transitioned to a real CTO role earlier, focusing more, like, on technical evangelism or building out the technical and non-technical infrastructure to help my engineering teams be successful. VICTORIA: Well, I really appreciate that. And is there anything else that you all would like to promote today? HENRY: So, if you're, you know, an engineering leader who is looking to measure, you know, some metrics and adopt a more data-driven approach to improving your software delivery performance, check out Apache DevLake. It's an open-source project, free to use, and it has some great dashboards and supports various data sources. And join our community. We have a pretty vibrant community on Slack. And there are a lot of developers and engineering leaders discussing how they can get more value out of data and metrics and improve software delivery performance. MAXIM: Yeah. 
And I think to add to that, something I think we've found consistently is there are plenty of data skeptics out there, rightfully so. I think a lot of analytics of every kind are really not very good, right? And so, I think people are rightfully frustrated or even traumatized by them. And for the data skeptics out there, I would invite them to dive into the DevLake community and pose their challenges, right? If you think this stuff doesn't make sense or you have concerns about it, come join the conversation because I think that's really where the most productive discussions end up coming from: not from people mutually high-fiving each other for a successful implementation of DORA. The really exciting moments come from the people in the community who are challenging it and saying, like, "You know what? Like, here's where I don't necessarily think something is useful or where I think it could be improved." And it's something that's not up to us as individuals to either bless or to deny. That's where the community gets really exciting: those discussions. So, I would say, if you're a data skeptic, come and dive in, and so long as you're respectful, challenge it. And by doing so, you'll hopefully not only help yourself but really help everybody, which is what I love about this stuff so much. JOE: I'm curious, does Merico use Merico? HENRY: Yes. We've been dogfooding ourselves a lot. And a lot of the product improvement ideas actually come from our own dogfooding process. For example, there was one time when we were looking at a dashboard that has this issue change lead time, and we found our issue change lead time, you know, went up over the past few months. And then we were trying to interpret whether that's a good thing or a bad thing because just looking at a single metric doesn't tell us the story behind the change in the metric. So, we actually improved the dashboard to include some, you know, covariates of the metric, some other related metrics, to help explain the trend of the metric. So yeah, dogfooding is always useful for improving the product. VICTORIA: That's great. Well, thank you all so much for joining. I really enjoyed our conversation. You can subscribe to the show and find notes along with a complete transcript for this episode at giantrobots.fm. If you have questions or comments, email us at hosts@giantrobots.fm. And you can find me on Twitter @victori_ousg. This podcast is brought to you by thoughtbot and produced and edited by Mandy Moore. Thanks for listening. See you next time.
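Joe's Git-churn hotspot idea from earlier in the conversation is simple enough to sketch. The snippet below is a minimal illustration of the general technique rather than any specific tool named in the episode; the repository path and the cut-off are placeholders.

# Minimal sketch of a Git-churn "hotspot" report: count how often each file
# appears in the commit history and list the most frequently changed ones.
import subprocess
from collections import Counter

def churn_hotspots(repo_path=".", top_n=20):
    # "--name-only" with an empty "--format=" prints only the touched file paths.
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--name-only", "--format="],
        capture_output=True, text=True, check=True,
    ).stdout
    counts = Counter(line for line in log.splitlines() if line.strip())
    return counts.most_common(top_n)

if __name__ == "__main__":
    for path, changes in churn_hotspots():
        print(f"{changes:5d}  {path}")

Files that dominate such a list are the "centers of gravity" Joe describes; pairing the raw change counts with file size or complexity measures is a common refinement when deciding what to refactor first.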
Join Danielle Reynolds as she sits down with Nathen McEown, Chief Growth Officer and Partner-in-Charge of Consulting at Whitley Penn, in the latest episode of our DEVELOP series podcast. Gain insights into effective business development strategies and cross-selling efforts, and explore what it means to have a growth mindset. Don't miss out - tune in now and stay tuned for future episodes highlighting our team of professionals.
Welcome Dr. Les to the show. He finished his EM residency in 1999, has spent 24 years as an attending, and estimates 130K patients seen in that time. He tells us about a busy night shift in the ED where he had to take care of an easy laceration but how the busyness of the night made him overlook something simple. I follow this up with a story that makes me look like an idiot, but sometimes it's okay to laugh at yourself. Les would tell his younger self that all the work he has done is worth it, but some things don't matter. Grinding is not always appreciated, so killing yourself for the convenience of people is probably going to go unnoticed a lot of the time. Nathen talks about how EMTs are often not listened to. If you aren't listening to your EMT, you are missing a lot of valuable info, and they can help you avoid getting stuck on a differential that no longer fits the picture. Ashaley talks about challenges in critical care and flight vs. the ambulance, the benefit of listening to and valuing volunteers, and how much they care about their communities. Casey talks about how TV shows will often depict paramedics as stupid and give the impression that they save everyone, when the reality is they have a huge knowledge base and saving lives isn't always possible. Kierra gives us a story about advocating for your patient and sticking with it even when the provider disagrees. I hope you guys enjoyed this episode. We have one more December panelist episode coming next week where we discuss a big topic I think you will really appreciate. Support the show
This episode is a special one for me since, for the first time, I have a returning guest on the show – my friend Nathen Harvey, the godfather of the DORA community! We talked about the community, the inaugural DORA Summit, and the freshly published 2023 State of DevOps Report. Nathen touched upon a couple of very interesting insights and shared a ton of advice! An interesting insight is about trunk-based development influencing burnout. The community is working hard to explain this, so feel free to join the DORA community trunk-based development discussion that is scheduled for Thursday, December 7 at 6PM UTC. Join the DORA Community of Practice for details and an invitation to the discussion. Please leave a review on your favorite podcast platform or Podchaser, and subscribe to the 0800-DEVOPS newsletter here. This interview is featured in 0800-DEVOPS #54 - 2023 State of DevOps Report with Nathen Harvey. [Check out podcast chapters if available on your podcast platform or use the links below] (0:00) Introduction (1:18) DORA community (6:49) 2023 State of DevOps Report (28:25) DORA research and service organizations (31:21) Accelerate, second edition (34:36) Follow recommendations
Today we have a second special for you from our place as podcasters in residence at the Gloucester History Festival, and joining Paul today is a return rager in the form of author and historian Nathen Amin, who has come back to rail at the more extremist elements of Ricardian culture (as very distinct from the Richard III Society) and how they go too far. We'll hear a surprising defence of Richard and how he just might not be the architect of his own downfall. You can, and should, read Nathen's book Henry VII and the Tudor Pretenders, which is available in the History Rage Bookshop, and you can follow Nathen on Twitter @nathenamin. If you've not managed to make it this year, the festival returns twice in 2024; those dates are 12th April to 14th April and 7th September to 22nd September 2024. You can sign up to the Festival Mailing List at gloucesterhistoryfestival.co.uk and follow them on Twitter @gloshistfest. Support the show Hosted on Acast. See acast.com/privacy for more information.
Imagine leaking your own mugshot for clout
JOIN THE STAG ROAR COMMUNITY This episode has been published and can be heard everywhere podcasts are available. https://www.stagroar.co.nz/ In these Mini-Podcasts we explore The Fallow Deer from D. Bruce Banwell's "The Fallow Deer", New Zealand Big Game Records Series, with permission of The Halcyon Press.
Questions lead to experimentation, which leads to evidence, allowing for conclusions, and then--voila!--practice. Equipoise was a new word for Dave, Mitch, Nathen and Riley. Jeff explains that it describes a state of equilibrium at which debate on a topic is no longer required, and factuality has effectively been achieved. But in science, that state has time and again been upset by new ideas and evidence that initially seem wrong. So, who decides whether the debate remains open, or has gone on long enough?
Talking with Nathen Montez of Slickman Welding Apparel, High Voltage Welding, and rock band Gone by 9. I've known Nate for about 6 years; the man is a hustler, no other way to put it. He's a family man first and has a true love for the welding world. I hope y'all enjoy listening to the conversation as much as I did having it. If you want to buy some of his welding apparel, go to Slickmanweldingapparel.com. Thanks for listening
In today's conversation, Heather Darsie speaks with Nathen Amin about the first Tudor king, Henry VII, whose legacy has been tarnished by unfair depictions in popular culture and history books. Amin, a renowned historian and author specializing in the Tudor period, sheds light on the real story behind Henry VII's reign, discussing his accomplishments, challenges, and the political landscape of the time. Through their engaging dialogue, Darsie and Amin aim to provide a more nuanced and accurate understanding of this pivotal figure in English history. Nathen's Books -- Love the Tudors? Read the stories of the Tudors on Tudors Dynasty! Shop Tudors Dynasty Merchandise Want a Commercial-Free Experience? Become a patron on Patreon! -- Credits: Host: Heather R. Darsie - Twitter, Books Guest: Nathen Amin - Twitter Edited by: Rebecca Larson - Twitter Voice Over: David Black Music: Ketsa, Alexander Nakarada, and Winnie the Moog via FilmMusic.io, used under EXTENDED license. --- Send in a voice message: https://anchor.fm/rebecca-larson/message Support this podcast: https://anchor.fm/rebecca-larson/support
The opening of a new venue is usually accompanied by stories of anguish and delay…or is it? We chat to Nathen Doyle on the VERY DAY he launched a new restaurant, Sunhands in Melbourne's Carlton. Nathen is also the co-owner of Heartattack & Vine, beloved for its porchetta, fine wine and good vibes. We talk about intentional hospitality and the big, bold, heartening vision that drove Nathen to open a second business. https://www.sunhands.com.au Follow Dirty Linen on Instagram https://www.instagram.com/dirtylinenpodcast Follow Dani Valent https://www.instagram.com/danivalent Follow Rob Locke (Executive Producer) https://www.instagram.com/foodwinedine/ Follow Huck (Executive Producer) https://www.instagram.com/huckstergram/ LISTEN TO OUR OTHER FOOD PODCASTS https://linktr.ee/DeepintheWeedsNetwork Dirty Linen is a food podcast hosted by Australian journalist Dani Valent. A respected restaurant critic and food industry reporter in her home town of Melbourne, Dani is a keen, compassionate observer of restaurants and the people who bring them into being. Whether it's owners, waiters, dishwashers, chefs or members of ancillary trades from tech to pottery, Dani interviews with compassion, humour and courage. Dirty Linen goes deep, both in conversations with individuals and in investigating pressing issues. Dirty Linen is an Australian food podcast produced by the Deep in the Weeds Podcast Network.
passacaglia by composer Nathen Durasamy, performed by The Creation String Quartet (www.nathendurasamy.com). a storm on the coast by composer Timothy Arliss OBrien, performed by L'abri Trio (www.timothyarlissobrien.com). hitt street harangue by composer Daniel Vega, performed by The Pierrot Ensemble. Theme music: composers breathing by Reminiscent Audio Display. This show is produced and published by The Poet Heroic
Send us a voicemail with YOUR 2023 predictions: (848) 863 5343 OR ELSE
This week we welcome historian, and author of "Henry VII and the Tudor Pretenders" and "The House of Beaufort - The Bastard Line That Captured The Crown", Nathen Amin, who joins us to rage that Henry VII had a perfectly legitimate claim to the throne of England. He talks to Paul and Kyle about how primogeniture is less important than people think, how Henry comes to the throne in the same way as Edward IV, and that you can call Henry many things, but a coward? Come on! If you'd like to know more about this subject then why not buy a copy of his books from the History Rage Bookshop. You can follow Nathen on Twitter @NathenAmin Support the show You can follow History Rage on Twitter @HistoryRage and let us know what you wish people would just stop believing using the hashtag #HistoryRage. You can join our 'Angry Mob' on Patreon as well. £5 per month gets you episodes 3 months early, the invite to choose questions, entry into our prize draws and the coveted History Rage mug. Subscribe at www.patreon.com/historyrage
It's good when you have to clarify you're "not" Kanye
Nathen Harvey needs little introduction in the DevOps community. He is a Developer Advocate at Google, co-author of the State of DevOps Report, and co-author of a great book called "97 Things Every Cloud Engineer Should Know". We talked about insights and surprises from this year's State of DevOps Report. And Nathen shared his view on recent hot takes that "DevOps is dead".
Oh geez
On the show this week, we're talking updated DevOps practices for 2022 with hosts Stephanie Wong and Chloe Condon and our guests Nathen Harvey and Derek DeBellis. Nathen and Derek start the show with a thorough discussion of DORA, the research program dedicated to helping organizations improve software delivery and operations, and the state of DevOps report that Google publishes every year. This year, the DevOps research team strengthened their focus on security and discovered that one of the biggest predictors in security practice adoption is company culture. Open, communicative, and trustful company cultures are some of the best for accepting and implementing optimized security practices. Derek tells us how company cultures are measured and scored for this purpose and Nathen talks about team and individual burnout and its effects on culture. Low, medium, high, and elite teams are another indicator of culture, and Nathen explains how teams earn their label through four keys of software delivery performance. Each year, they let the data show these four clusters of team performance. But this year there were only three, and Derek talks more about this phenomenon and why the elite cluster seems to have disappeared. When operational performance analysis was added, the four clusters reemerged and were renamed to better suit the new analysis metrics. Nathen details these four new clusters: starting, which performs neither well nor poorly and may be just starting out; flowing, teams that are performing well across throughput, stability, and operational performance; slowing teams, which don't have high throughput but excel in other areas; and retiring teams, which are reliable but not actively developing projects. We discuss how companies may shift from one cluster to another and how much context can affect this shift. We talk about key findings in the 2022 DevOps report, especially in the security space. Some of the most notable include the adoption of DevOps security practices and the decreased incidence of burnout on teams who leverage security practices. Nathen and Derek elaborate on how this year's research changed from last year and what remained the same. Nathen Harvey Nathen works with teams helping them learn about and apply the findings of our research into high performing teams. He's been involved in the DevOps community for more than a decade. Derek DeBellis Derek is a Quantitative User Experience Researcher at Google, where he focuses on survey research, logs analysis, and figuring out ways to measure concepts central to product development. Derek has published on Human-AI interaction, the impact of Covid-19's onset on smoking cessation, designing for NLP errors and the role of UX in ensuring privacy. 
Cool things of the week: Try out Cloud Spanner databases at no cost with new free trial instances (blog); Chipotle Is Testing More Artificial Intelligence Solutions To Improve Operations (article); Gyfted uses Google Cloud AI/ML tools to match tech workers with the best jobs (blog). Interview: 2022 Accelerate State of DevOps Report (blog); DevOps (site); 2022 State of the DevOps Report (report site); DORA (site); DORA Community (site); SLSA (site); Security Software Development Framework (site); Westrum organizational culture (site); Google finds culture, not tech, is the biggest predictor of DevOps security outcomes (article); GCP Podcast Episode 205: DevOps with Nathen Harvey and Jez Humble (podcast); GCP Podcast Episode 284: State of DevOps Report 2021 with Nathen Harvey and Dustin Smith (podcast); GCP Podcast Episode 290: Resiliency at Shopify with Camilo Lopez and Tai Dickerson (podcast). What's something cool you're working on? Steph is working on talks for DevFest Nantes and a Google Cloud dev conference in London. She'll be talking about subsea fiber optics and Google Cloud networking products. Chloe is a Noogler, so she's been working on learning as much as she can! She is excited to make her podcast debut this week! Hosts: Stephanie Wong and Chloe Condon
The RNC Deputy Communications Director (GOP.com) joins Ryan Wrecker to talk about Stacey Abrams' comments saying "Fetal Heartbeat" is manufactured.
Our patron Lynn requested that we read Arabiolosis by Nathen Mazri, the man who brought the world GarfieldEATS. We were soon launched into an endless downward internet spiral of Garficcinos and rancid lasagna to discover what toxic amounts of privilege can unleash upon the world. Content Warnings: In addition to our usual barnyard language, today's episode includes discussion or mention of: gender and sexual politics in Saudi Arabia and HIV/AIDS.
We speak to historian Nathen Amin about the House of Beaufort, the initially illegitimate line of John of Gaunt who enjoyed great power and influence under the Lancastrian kings and were major players in the Wars of the Roses. We learn about some of the key figures, why the Wars of the Roses may have started earlier than you think, and how they ultimately found one of their own on the throne with Henry VII.To hear more from Nathen, you can follow him on Twitter where he is @NathenAmin Our GDPR privacy policy was updated on August 8, 2022. Visit acast.com/privacy for more information.
Use code Pulpmx30 to save at FXR Racing.com and code Pulp20 to save at Race Tech. We talk to the Wisconsin privateer Nathen Laporte about his time racing the nationals, his thoughts on trying SX this year, racing some AX, what he does for a living, having his wife be his mechanic, and more.
Welcome to another Game Club episode of The Hardcore Gamerz Show! This month the Gamerz played two games for the game of the month: Toem by Something We Made and Transistor from Supergiant Games. Listen to Michael, Vincent, and Nathen review these stylish indie games now! 00:01:26 Game Club - Toem 00:22:19 Game Club - Transistor If you want to get in on the discussion, join our Discord here: https://discord.gg/wxbNkGUcrg Not on Discord? Send us your thoughts at HGZgameclub@gmail.com Be sure to subscribe so you don't miss our next Hardcore Gamerz Show and the Game Club pick for June! Say hi on Twitter The Hardcore Gamerz Show (@TheHGZShow) Michael Koval (@Sensei_Dank) Vincent Hand (@hylian_himbo) Nathen Ludahl (@iguessnathen) Delane Cunningham (@dellahalter) Tell a friend about the podcast, and thank you so much for your support! --- This episode is sponsored by Anchor: The easiest way to make a podcast. https://anchor.fm/app
Imagine if that restaurant from Five Nights At Freddy's was a discord server
Break out your 'Bingo' cards and mark off every time Nathen says "im gay"
This week I have long time friend of the band Flogging Molly, Nathen Maxwell. I did one of my first tours ever sharing a bus with him, over 20 years ago! We talk about how Nathen got his start in music and all of the projects he finds himself involved in. He’s one of my favorite people and musicians and I’m glad he took the time to come on my show.
Special sauce guest is Nathan Orr. He shows us how to Understand The Construction Bid and sets you up for success…