Aaron Beam discusses Red Fang's new collection and two decades of riff-driven sludge metal, plus Justin Benlolo of BRKN Love joins the show

TL;DR: This week on the Loaded Radio Podcast, Aaron Beam from Red Fang joins us to celebrate the band's 20th anniversary and discuss the release of their latest collection, Deep Cuts. Beam dives into the band's beginnings, the process of assembling rare tracks, and the inspiration behind Red Fang's unrelenting sound. Later in the episode, Justin Benlolo of BRKN Love drops in to talk about the band's ongoing evolution and the exciting plans ahead.

Red Fang reflect on two decades of sludge-soaked greatness

Formed in 2005 in Portland, Oregon, Red Fang carved out a distinct place in the modern heavy scene with their gritty, hook-laden sludge metal. Over the last 20 years, they've delivered one crushing release after another, gaining a reputation for both their anthemic riffs and their offbeat sense of humor. Now, Aaron Beam and the rest of Red Fang mark their 20th year with Deep Cuts, a sprawling 26-track release packed with non-album tracks, covers, demos, and hidden gems. It's a celebration of the band's journey so far, offering longtime fans a treasure trove of rarely heard material that still captures the raw, unfiltered energy Red Fang are known for. On the latest episode of the Loaded Radio Podcast, Beam opens up about the creation of Deep Cuts, shares reflections on the early days of jamming in drummer John Sherman's basement, and talks about what it feels like to still be making music together two decades later.

Deep Cuts offers a full-throttle dive into Red Fang's evolution

From the thunderous opening riff of “Antidote” to the atmospheric shifts of tracks like “Hollow Light” and “Endless Sea,” Deep Cuts showcases the full range of Red Fang's capabilities. Beam notes that the collection captures the oddities, covers, and experimental moments that have fueled the band's evolution over the years.
Highlights include their gritty renditions of Wipers' “Over the Edge” and Dust's “Suicide,” as well as fan-favorite deep cuts like “Stereo Nucleosis” and “Black Water.” It's a wild ride through the different textures and tones that have shaped Red Fang's career, from straight-up sludge anthems to more experimental, synth-laced jams. Beam also reflects on how his own musical journey, and the shared history with bandmates Bryan Giles, David Sullivan, and John Sherman, continues to push Red Fang's sound forward even after 20 years.

Justin Benlolo of BRKN Love joins the conversation

The second half of this week's podcast features Justin Benlolo, the dynamic frontman of BRKN Love. Benlolo and Loaded Radio's Scott Penfold dig into the band's unique blend of hard rock and alternative influences, their songwriting process, and their ambitions as they continue building momentum within the modern rock landscape. With new music and major tour plans on the horizon, BRKN Love are positioning themselves as one of the exciting young acts keeping heavy music alive in a constantly evolving scene.
We open the show talking all things NFL Draft, the Chiefs' plan, and the positives and negatives of John Sherman's tenure. See omnystudio.com/listener for privacy information.
In Episode #64, host John Sherman interviews seventh grader Dylan Pothier, his mom Bridget, and his teacher Renee DiPietro. Dylan is an award-winning student author who is concerned about AI risk. (FULL INTERVIEW STARTS AT 00:33:34)
Sam Altman/Chris Anderson @ TED: https://www.youtube.com/watch?v=5MWT_doo68k
Check out our partner channel: Lethal Intelligence AI - Home https://lethalintelligence.ai
FOR HUMANITY MONTHLY DONATION SUBSCRIPTION LINKS:
$1 MONTH https://buy.stripe.com/7sI3cje3x2Zk9SodQT
$10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y46oo
$25 MONTH https://buy.stripe.com/3cs9AHf7B9nIggM4gh
$100 MONTH https://buy.stripe.com/aEU007bVp7fAfcI5km
BUY LOUIS BERMAN'S NEW BOOK ON AMAZON!!! https://a.co/d/8WSNNuo
Get Involved! EMAIL JOHN: forhumanitypodcast@gmail.com
SUPPORT PAUSE AI: https://pauseai.info/
SUPPORT STOP AI: https://www.stopai.info/
SUBSCRIBE TO LIRON SHAPIRA'S DOOM DEBATES on YOUTUBE! / @doomdebates
In an emotional interview, host John Sherman interviews Poornima Rao and Balaji Ramamurthy, the parents of Suchir Balaji. (FULL INTERVIEW STARTS AT 00:18:38)
Suchir Balaji was a 26-year-old artificial intelligence researcher who worked at OpenAI. He was involved in developing models like GPT-4 and WebGPT. In October 2024, he publicly accused OpenAI of violating U.S. copyright laws by using proprietary data to train AI models, arguing that such practices harmed original content creators. His essay, "When does generative AI qualify for fair use?", gained attention and was cited in ongoing lawsuits against OpenAI. Suchir left OpenAI in August 2024, expressing concerns about the company's ethics and the potential harm of AI to humanity. He planned to start a nonprofit focused on machine learning and neuroscience. On October 23, 2024, he was featured in the New York Times speaking out against OpenAI. On November 26, 2024, he was found dead in his San Francisco apartment from a gunshot wound. The initial autopsy ruled it a suicide, noting the presence of alcohol, amphetamines, and GHB in his system. However, his parents contested this finding, commissioning a second autopsy that suggested a second gunshot wound was missed in the initial examination. They also pointed to other injuries and questioned the presence of GHB, suggesting foul play. Despite these claims, authorities reaffirmed the suicide ruling.
The case has attracted public attention, with figures like Elon Musk and Congressman Ro Khanna calling for further investigation. Suchir's parents continue to push for justice and truth.
Suchir's website: https://suchir.net/fair_use.html
FOR HUMANITY MONTHLY DONATION SUBSCRIPTION LINKS:
$1 MONTH https://buy.stripe.com/7sI3cje3x2Zk9SodQT
$10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y46oo
$25 MONTH https://buy.stripe.com/3cs9AHf7B9nIggM4gh
$100 MONTH https://buy.stripe.com/aEU007bVp7fAfcI5km
Lethal Intelligence AI - Home https://lethalintelligence.ai
BUY LOUIS BERMAN'S NEW BOOK ON AMAZON!!! https://a.co/d/8WSNNuo
Get Involved! EMAIL JOHN: forhumanitypodcast@gmail.com
SUPPORT PAUSE AI: https://pauseai.info/
SUPPORT STOP AI: https://www.stopai.info/
SUBSCRIBE TO LIRON SHAPIRA'S DOOM DEBATES on YOUTUBE! / @doomdebates
The Royals had the perfect setup to win on opening day, right up until they started playing giveaway with the Cleveland Guardians. It's only one game and we won't overreact here, but we do want to look at what we saw and ask whether some of these things are previews of more to come. Meanwhile, Mayor Q explains how he's going to have $1.4 billion for the Royals to stay in Kansas City as owner John Sherman pivots away from using the term "downtown" as his preferred destination. The elite college teams are putting 100 or more points on the board and it's stunning to see how great they look, and our Song of the Week is a new release that I really love.
John Sherman, Royals Owner, joins Pete Mundo to discuss the Royals season and an update on potential stadium news!
The Kansas City Royals are coming off their best season in almost a decade after winning a playoff series in 2024. In a live broadcast from Kauffman Stadium, KCUR's Up To Date spoke with Royals owner John Sherman, general manager J.J. Picollo and more about the upcoming season and stadium negotiations.
John Sherman spoke on the attempt to add a bat and a lineup change. (Thu, 27 Mar 2025) Hosts Cody Tapp & Alex Gold team up for 610 Sports Radio's newest mid-day show "Cody & Gold." Two born & raised Kansas Citians, Cody & Gold have been through all the highs and lows as KC sports fans and they know the passion Kansas City has for their sports teams. "Cody & Gold" will be a show focused on smart sports conversation with the best voices from KC and around the country. It will also feature our listeners with your calls, texts & tweets, as we want you to be a part of the show, not just a listener. Cody & Gold, weekdays 10a-2p on 610 Sports Radio.
Royals Chairman and CEO John Sherman joins "Cody & Gold," weekdays 10a-2p on 610 Sports Radio. (Thu, 27 Mar 2025)
Royals owner John Sherman comments on new stadium. (The Jayme & Grayson Podcast, Thu, 27 Mar 2025) Catch each and every hour of Midday with Jayme & Grayson as they discuss the hot topics in Kansas City and around the country...
John Sherman, Cole Ragans, and Matt Quatraro audio in the Need to Know, NBA Scout and former Mizzou Tiger Jarrett Sutton joins us to talk College Hoops and Transfer Portal craziness, and a Michael Jordan Kicker!
The Drive played audio of what Royals CEO and owner John Sherman said when he spoke to the media about the future of the stadium.
In a remarkable video tour of the Oval Office, President Trump has made us all proud that he has basically turned his work space into a functional American History Museum. The Department of Education has been cut in half and now the other half is about to get whacked. This is fun... and important. We explain why. You've likely never heard of Natalie Winters, but you have to hear this clip today if you don't hear anything else. She appeared on a show with Christopher Steele of the infamous Steele dossier and she just let him have it right to his face. Royals owner John Sherman spent an inning on TV during the Royals game Wednesday night and indicates we have a few more months to wait before we learn anything about the team's plans for a new ballpark. KU and MU tip off tonight in the tourney; we have your previews. A new local golf tournament is low-key, and if you get in on the ground floor and play or sponsor, you can make a big difference. Then, a United Airlines pilot makes a stern announcement to passengers on final approach.
Aaron Beam is the singer and bass player for Portland-based Red Fang. In celebration of 20 years in the riff business, Red Fang releases Deep Cuts, an extensive 26-song collection of non-album tracks, covers, and previously unreleased singles, out now via Relapse Records. In this episode, Aaron paints a picture of the early-aughts Portland scene and how it provided the exact ingredients necessary to form Red Fang. Joe and Aaron discuss how Deep Cuts is more than a collection of rarities and recollection, and why it's more of an illustrative assemblage of songs that takes them out of the Stoner Rock corner. We learn Red Fang's approach to songwriting, why Aaron views it as both technical and dumb, and how super drummer John Sherman ties it all up with his magical touch. Aaron catches us up on his current rock endeavors, we hear a medley of tunes from Deep Cuts, and for the first time ever on Tour Stories, we take questions from a super fan. Red Fang | Relapse Records. Izotope is the leader in audio repair, mixing and mastering. Ruinous uses Izotope and you should too. Trust us. The best way to get your music into the world's ears is Distrokid. Artists keep 100% of their royalties, and their mobile app is smartly designed, easy to use and perfectly intuitive. Please visit Izotope and Distrokid for continued exclusive listener discounts.
Weeks after being named the chief information security officer for the Defense Department, Katie Arrington was announced Monday as the Pentagon's official “Performing the Duties of the Department of Defense Chief Information Officer.” The DOD Office of the CIO announced the move by Secretary of Defense Pete Hegseth to place Arrington as the acting CIO in a post on LinkedIn. The post also confirmed that Leslie Beavers, who had been acting CIO since John Sherman left the role last June, will return to her primary role as principal deputy CIO. A defense official confirmed Arrington started in the role Monday. An organization that's filed multiple legal challenges against the Trump administration is focusing its attention on the potential use of artificial intelligence in personnel decisions. Democracy Forward, a social welfare organization, said Monday it “launched a public records investigation” into the administration's AI use, including filing requests under the Freedom of Information Act. Skye Perryman, president and CEO of Democracy Forward, said in a written statement that the American people deserve to know what is going on — including if and how artificial intelligence is being used to reshape the departments and agencies people rely on daily. The organization's requests come after reporting by NBC that Trump and Elon Musk's Department of Government Efficiency planned to use AI to review employee responses to the Office of Personnel Management's “five bullets” email. The Daily Scoop Podcast is available every Monday-Friday afternoon. If you want to hear more of the latest from Washington, subscribe to The Daily Scoop Podcast on Apple Podcasts, Soundcloud, Spotify and YouTube.
Zach and I discuss some playful 'Podcast Bros Beef' over our new recording software's AI-clipped edits, which accidentally made Zach appear 'insane.' We also delve into Zach's golf game progress, covering his simulator training during the off-season, swing path improvements, and balancing workouts with practice to avoid injury. The conversation transitions to a spirited debate sparked by a Twitter post from a popular online golf instructor criticizing the term 'swing rebuild,' leading to a discussion on significant versus minor swing changes. We reflect on a compelling podcast episode by Adam Young and John Sherman, which features the impressive golf journey of Will Knauth and highlights the importance of tournament play, consistent practice, and motor learning principles. Key takeaways include the relevance of fitness, unconventional employment, and the skill required to excel in varying golf environments, from practice ranges to competitive play. 00:00 Podcast Bros Beef: AI Edits Gone Wrong 01:49 Golf Game Update: Swing Speed and Training 05:17 Simulator Sessions: Practice and Enjoyment 11:51 Fitness and Golf: The Energy Connection 17:37 Internet Fight: Swing Rebuild Debate 19:17 Legal Insights and Swing Rebuilds 19:36 The Subjectivity of Swing Rebuilds 19:49 Navigating Online Golf Content 21:40 Marketing vs. Education in Golf 23:27 Personal Swing Journey 25:34 Comparisons to Tiger Woods
What will 2025 bring? Sam Altman says AGI is coming in 2025. Agents will arrive for sure. Military use will expand greatly. Will we get a warning shot? Will we survive the year? In Episode #57, host John Sherman interviews AI Safety Research Engineer Max Winga about the latest in AI advances and risks and the year to come.
FOR HUMANITY MONTHLY DONATION SUBSCRIPTION LINKS:
$1 MONTH https://buy.stripe.com/7sI3cje3x2Zk9SodQT
$10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y46oo
$25 MONTH https://buy.stripe.com/3cs9AHf7B9nIggM4gh
$100 MONTH https://buy.stripe.com/aEU007bVp7fAfcI5km
Anthropic Alignment Faking Video: https://www.youtube.com/watch?v=9eXV64O2Xp8&t=1s
Neil DeGrasse Tyson Video: https://www.youtube.com/watch?v=JRQDc55Aido&t=579s
Max Winga's Amazing Speech: https://www.youtube.com/watch?v=kDcPW5WtD58
Get Involved! EMAIL JOHN: forhumanitypodcast@gmail.com
SUPPORT PAUSE AI: https://pauseai.info/
SUPPORT STOP AI: https://www.stopai.info/about
Check out our partner channel: Lethal Intelligence AI - Home https://lethalintelligence.ai
SUBSCRIBE TO LIRON SHAPIRA'S DOOM DEBATES on YOUTUBE!! https://www.youtube.com/@DoomDebates
BUY STEPHEN HANSON'S BEAUTIFUL AI RISK BOOK!!! https://stephenhansonart.bigcartel.com/product/the-entity-i-couldn-t-fathom
22 Word Statement from Center for AI Safety: Statement on AI Risk | CAIS https://www.safe.ai/work/statement-on-ai-risk
Best Account on Twitter: AI Notkilleveryoneism Memes https://twitter.com/AISafetyMemes
FOR HUMANITY MONTHLY DONATION SUBSCRIPTION LINKS: $1 MONTH https://buy.stripe.com/7sI3cje3x2Zk9S... $10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y... $25 MONTH https://buy.stripe.com/3cs9AHf7B9nIgg... $100 MONTH https://buy.stripe.com/aEU007bVp7fAfc... In Episode #56, host John Sherman travels to Washington DC to lobby House and Senate staffers for AI regulation along with Felix De Simone and Louis Berman of Pause AI. We unpack what we saw and heard as we presented AI risk to the people who have the power to make real change. SUPPORT PAUSE AI: https://pauseai.info/ SUPPORT STOP AI: https://www.stopai.info/about EMAIL JOHN: forhumanitypodcast@gmail.com Check out our partner channel: Lethal Intelligence AI Lethal Intelligence AI - Home https://lethalintelligence.ai SUBSCRIBE TO LIRON SHAPIRA'S DOOM DEBATES on YOUTUBE!! / @doomdebates BUY STEPHEN HANSON'S BEAUTIFUL AI RISK BOOK!!! https://stephenhansonart.bigcartel.co... 22 Word Statement from Center for AI Safety Statement on AI Risk | CAIS https://www.safe.ai/work/statement-on... Best Account on Twitter: AI Notkilleveryoneism Memes / aisafetymemes
In a special episode of For Humanity: An AI Risk Podcast, host John Sherman travels to San Francisco. Episode #55, "Near Midnight in Suicide City," is a set of short pieces from our trip out west, where we met with Pause AI, Stop AI, and Liron Shapira, and stopped by OpenAI, among other events. Big, huge, massive thanks to Beau Kershaw, Director of Photography, and my biz partner and best friend who made this journey with me through the work side and the emotional side of this. The work is beautiful and the days were wet and long and heavy. Thank you, Beau. SUPPORT PAUSE AI: https://pauseai.info/ SUPPORT STOP AI: https://www.stopai.info/about FOR HUMANITY MONTHLY DONATION SUBSCRIPTION LINKS: $10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y... $25 MONTH https://buy.stripe.com/3cs9AHf7B9nIgg... $100 MONTH https://buy.stripe.com/aEU007bVp7fAfc... EMAIL JOHN: forhumanitypodcast@gmail.com Check out our partner channel: Lethal Intelligence AI - Home https://lethalintelligence.ai @lethal-intelligence-clips
In this episode of the Fairway Performance Podcast, Shaun and John Sherman delve into the intricacies of competitive golf, discussing John's latest book, 'The Foundations of Winning Golf.' They explore the mindset required for competition and the importance of managing expectations and external perceptions. John shares insights on how to learn from competitive experiences and develop a mental toolkit to handle pressure, emphasising that golf is as much about personal growth as it is about performance. Later in the conversation, John and Shaun delve into the mental and physical aspects of golf, emphasising the importance of developing a resilient identity, managing expectations, and the role of fitness in performance. They discuss how Stoicism can provide a framework for golfers to navigate challenges on the course, the significance of bogey avoidance, and strategies for consistent performance. The dialogue highlights the need for golfers to focus on preparation, effort, and attitude while also embracing the joy of the game. To learn more about John and his content and books, click here. Have questions, feedback or suggestions for future episodes? Click here to send me a text!
This episode of Midweek Takeaway comes to you live from the Mines and Money Resourcing Tomorrow conference at the Business Design Centre in London. We speak with Marian Moroney, Non-Executive Director of Conroy Gold, and John Sherman, Chairman of Conroy Gold, about the recent assay results and developments from their gold projects in Ireland. They also delve into the importance of antimony, a critical EU raw material, and its potential at the Clontibret site. Additionally, we discuss Conroy Gold's extensive re-logging programme, covering over 30,000m of drill core to extract more comprehensive and consistent geological data. This ongoing initiative is set to inform key decisions for the next cycle of major project investment. Join us for insights into the company's exploration strategy and its role in advancing mining innovation. Disclaimer & Declaration of Interest The information, investment views, and recommendations in this podcast are provided for general information purposes only. Nothing in this podcast should be construed as a solicitation to buy or sell any financial product relating to any companies under discussion or to engage in or refrain from doing so or engaging in any other transaction. Any opinions or comments are made to the best of the knowledge and belief of the commentator but no responsibility is accepted for actions based on such opinions or comments. The commentators may or may not hold investments in the companies under discussion.
Nov 19, 2024. For Humanity: An AI Safety Podcast. In Episode #54 John Sherman interviews Connor Leahy, CEO of Conjecture. (FULL INTERVIEW STARTS AT 00:06:46)
DONATION SUBSCRIPTION LINKS:
$10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y...
$25 MONTH https://buy.stripe.com/3cs9AHf7B9nIgg...
$100 MONTH https://buy.stripe.com/aEU007bVp7fAfc...
EMAIL JOHN: forhumanitypodcast@gmail.com
Check out Lethal Intelligence AI: Lethal Intelligence AI - Home https://lethalintelligence.ai @lethal-intelligence-clips
On this episode of The Midweek Takeaway, we're joined by John Sherman, the newly appointed Chairman of Conroy Gold and Natural Resources, and Kevin McNulty, Senior Geologist. Together, we explore the legacy of the late Professor Richard Conroy and the future of the company under John's leadership. We dive into the exciting progress at the Discs of Gold project in Ireland, including updates on the Clontibret gold deposit, new exploration targets like Clay Lake and Creenkill, and the potential for multi-million-ounce discoveries along the district-scale gold trends. Kevin shares insights from over 30,000 meters of drill core analysis and the parallels with world-class deposits in Australia and Canada. Whether you're an investor, a geology enthusiast, or just intrigued by the mining sector, this episode offers a fascinating look at the potential for a Tier 1 resource in one of Europe's most promising gold districts. Don't miss it! Disclaimer & Declaration of Interest The information, investment views, and recommendations in this podcast are provided for general information purposes only. Nothing in this podcast should be construed as a solicitation to buy or sell any financial product relating to any companies under discussion or to engage in or refrain from doing so or engaging in any other transaction. Any opinions or comments are made to the best of the knowledge and belief of the commentator but no responsibility is accepted for actions based on such opinions or comments. The commentators may or may not hold investments in the companies under discussion.
In Episode #53, John Sherman interviews Michael DB Harvey, author of The Age of Humachines. The discussion covers the coming spectre of humans putting digital implants inside themselves to try to compete with AI. DONATION SUBSCRIPTION LINKS: $10 MONTH https://buy.stripe.com/5kAbIP9Nh0Rc4y... $25 MONTH https://buy.stripe.com/3cs9AHf7B9nIgg... $100 MONTH https://buy.stripe.com/aEU007bVp7fAfc...
In Episode #52, host John Sherman looks back on the first year of For Humanity. Select shows are featured, as well as a very special celebration of life at the end.
In Episode #51, host John Sherman talks with Tom Barnes, an Applied Researcher with Founders Pledge, about the reality of AI risk funding, and about the need for emergency planning for AI to be much more robust and detailed than it is now. We are currently woefully underprepared. Learn More About Founders Pledge: https://www.founderspledge.com/ No celebration of life this week!! Youtube finally got me with a copyright flag, had to edit the song out. THURSDAY NIGHTS--LIVE FOR HUMANITY COMMUNITY MEETINGS--8:30PM EST Join Zoom Meeting: https://storyfarm.zoom.us/j/816517210... Passcode: 829191 Please Donate Here To Help Promote For Humanity https://www.paypal.com/paypalme/forhu... EMAIL JOHN: forhumanitypodcast@gmail.com This podcast is not journalism. But it's not opinion either. This is a long-form public service announcement. This show simply strings together the existing facts and underscores the unthinkable probable outcome, the end of all life on earth. For Humanity: An AI Safety Podcast is the accessible AI Safety Podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI. Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit their work could kill all humans, in as soon as 2 years. This podcast is solely about the threat of human extinction from AGI. We'll meet the heroes and villains, explore the issues and ideas, and what you can do to help save humanity. **************** RESOURCES: SUBSCRIBE TO LIRON SHAPIRA'S DOOM DEBATES on YOUTUBE!! / @doomdebates Join the Pause AI Weekly Discord, Thursdays at 2pm EST / discord Max Winga's “A Stark Warning About AI Extinction” For Humanity Theme Music by Josef Ebner Youtube: / @jpjosefpictures Website: https://josef.pictures BUY STEPHEN HANSON'S BEAUTIFUL AI RISK BOOK!!! 
https://stephenhansonart.bigcartel.co... 22 Word Statement from Center for AI Safety: Statement on AI Risk | CAIS https://www.safe.ai/work/statement-on... Best Account on Twitter: AI Notkilleveryoneism Memes / aisafetymemes
In Episode #50, host John Sherman talks with Deger Turan, CEO of Metaculus, about what his prediction market reveals about the AI future we are all heading towards. LEARN MORE: www.metaculus.com Please Donate Here To Help Promote For Humanity https://www.paypal.com/paypalme/forhu... EMAIL JOHN: forhumanitypodcast@gmail.com
Best Account on Twitter: AI Notkilleveryoneism Memes / aisafetymemes
In Episode #51 Trailer, host John Sherman talks with Tom Barnes, an Applied Researcher with Founders Pledge, about the reality of AI risk funding, and about the need for emergency planning for AI to be much more robust and detailed than it is now. We are currently woefully underprepared. Learn More About Founders Pledge: https://www.founderspledge.com/
22 Word Statement from Center for AI Safety Statement on AI Risk | CAIS https://www.safe.ai/work/statement-on... Best Account on Twitter: AI Notkilleveryoneism Memes / aisafetymemes
On Monday, September 17, the INSA Foundation, in partnership with ClearanceJobs and GDIT, hosted the second installment of the "Future of the IC Workforce: Technology and Talent Transformation" series. Speakers included Kimberly King, Career Service Manager for Analysis, DIA; The Hon. John Sherman, Dean, The Bush School of Government and Public Service, Texas A&M University; and moderator Lindy Kyzer, Director of Content and PR, ClearanceJobs. The program opened with the speakers discussing the unique value that national security careers offer. Ms. King said there is merit in safeguarding the nation by being part of something bigger than yourself, and that joining the workforce offers a "combination of mission and chance to drive your own career." Dean Sherman reflected on his career pivots and on being at the forefront of technological advancement over the years, noting that "being a trailblazer" in a discipline sets it apart from other career paths. The conversation then shifted to the private sector's competitive strategy to retain and attract new talent. Ms. King highlighted that DIA is building its talent pipeline through the IC Centers for Academic Excellence, an initiative that leverages DIA's connections with students across the country to fill internships. The pay gap between the private sector and government presents a complex challenge for recruiting STEM talent; Ms. King revealed that the agency's new pay model is a proactive recruiting strategy that is attracting STEM students to the DIA. Dean Sherman responded that students at Texas A&M's Bush School of Government and Public Service are drawn to the national security mission.
To transform mission-focused students into employees, the Bush School of Government and Public Service hosts professors of the practice who provide a "tangible real-world example" of careers in the workforce. Upskilling and training are focus areas for the DIA, which works to ensure that employees are digitally literate in emerging technologies. Ms. King noted that across the agency there is both formal and informal training for employees; continuing-learning opportunities at DIA include technical training at universities, senior service schools, and speaker visits. Mr. Sherman urged academia to prepare the future workforce by instilling effective, concise communication in students. "Getting them in a mindset that they're writing [for] decision-makers," said Mr. Sherman. The ability to write clearly and brief a policy is a critical skill the workforce values. The speakers agreed that reaching the future workforce requires a flexible workspace, and that improving retention and attraction strategies depends on effectively tracking talent: insufficient communication with applicants is costing agencies skilled candidates during the hiring process. Hosted on Acast. See acast.com/privacy for more information.
In Episode #50 TRAILER, host John Sherman talks with Deger Turan, CEO of Metaculus, about what his prediction market reveals about the AI future we are all heading towards. LEARN MORE–AND JOIN STOP AI www.stopai.info
In Episode #49, host John Sherman talks with Sam Kirchner and Remmelt Ellen, co-founders of Stop AI, a new AI risk protest organization with different tactics and goals than Pause AI. LEARN MORE–AND JOIN STOP AI www.stopai.info
In Episode #48, host John Sherman talks with Pause AI US Founder Holly Elmore about the limiting origins of the AI safety movement. Polls show 60-80% of the public are opposed to building artificial superintelligence. So why is the movement to stop it still so small? The roots of the AI safety movement have a lot to do with it. Holly and John explore the present-day issues created by the movement's origins. Let's build community! Live For Humanity Community Meeting via Zoom, Thursdays at 8:30pm EST...explanation during the full show! USE THIS LINK: https://storyfarm.zoom.us/j/88987072403 PASSCODE: 789742 LEARN HOW TO HELP RAISE AI RISK AWARENESS IN YOUR COMMUNITY HERE https://pauseai.info/local-organizing
In Episode #49 TRAILER, host John Sherman talks with Sam Kirchner and Remmelt Ellen, co-founders of Stop AI, a new AI risk protest organization with different tactics and goals than Pause AI. LEARN MORE–AND JOIN STOP AI www.stopai.info
Hour 1 – The Drive opened the show discussing how John Sherman came through on the promise he made at Dayton Moore's resignation by giving the city a playoff series.
In Episode #48 Trailer, host John Sherman talks with Pause AI US Founder Holly Elmore about the limiting origins of the AI safety movement.
In Episode #47, host John Sherman talks with Buck Shlegeris, CEO of Redwood Research, a non-profit working on technical AI risk challenges. The discussion includes Buck's thoughts on the new OpenAI o1-preview model, but centers on two questions: can AI models be controlled before alignment is achieved, if it can be achieved at all, and how would the system that's supposed to save the world actually work if an AI lab caught a model scheming? Check out these links to Buck's writing on these topics: https://redwoodresearch.substack.com/p/the-case-for-ensuring-that-powerful https://redwoodresearch.substack.com/p/would-catching-your-ais-trying-to Senate Hearing: https://www.judiciary.senate.gov/committee-activity/hearings/oversight-of-ai-insiders-perspectives Harry Mack's YouTube Channel: https://www.youtube.com/channel/UC59ZRYCHev_IqjUhremZ8Tg LEARN HOW TO HELP RAISE AI RISK AWARENESS IN YOUR COMMUNITY HERE https://pauseai.info/local-organizing
In Episode #47 Trailer, host John Sherman talks with Buck Shlegeris, CEO of Redwood Research, a non-profit company working on technical AI risk challenges.
In Episode #46, host John Sherman talks with Daniel Faggella, Founder and Head of Research at Emerj Artificial Intelligence Research. Dan has been speaking out about AI risk for a long time, but he comes at it from a different perspective than many: he thinks we need to talk about how we can make AGI, and whatever comes after it, humanity's worthy successor. More About Daniel Faggella: https://danfaggella.com/
In Episode #46 Trailer, host John Sherman talks with Daniel Faggella, Founder and Head of Research at Emerj Artificial Intelligence Research.
In Episode #45, host John Sherman talks with Dr. Mike Brooks, a psychologist focusing on kids and technology. The conversation is broad-ranging, touching on parenting, happiness and screens, the need for human unity, and the psychology of humans facing an ever more unknown future. FULL INTERVIEW STARTS AT (00:05:28) Mike's book: Tech Generation: Raising Balanced Kids in a Hyper-Connected World An article from Mike in Psychology Today: The Happiness Illusion: Facing the Dark Side of Progress Find Dr. Brooks on Social Media: LinkedIn | X/Twitter | YouTube | TikTok | Instagram | Facebook https://www.linkedin.com/in/dr-mike-brooks-b1164120 https://x.com/drmikebrooks https://www.youtube.com/@connectwithdrmikebrooks https://www.tiktok.com/@connectwithdrmikebrooks?lang=en https://www.instagram.com/drmikebrooks/?hl=en Chris Gerrby's Twitter: https://x.com/ChrisGerrby LEARN HOW TO HELP RAISE AI RISK AWARENESS IN YOUR COMMUNITY HERE https://pauseai.info/local-organizing
In Episode #45 TRAILER, host John Sherman talks with Dr. Mike Brooks, a psychologist focusing on kids and technology. The conversation is broad-ranging, touching on parenting, happiness and screens, the need for human unity, and the psychology of humans facing an ever more unknown future.
Hour 1 – The Drive discussed what John Sherman said to Josh Vernier about the Royals' attendance and why fans are slow to return to the ballpark.
The Drive discussed how John Sherman's approach to the upcoming deadline is well reasoned and practical.
Royals owner John Sherman sat down with our own Royals insider Josh Vernier and discussed everything from the roster to the stadium. Then we dive into NFL offseason discussion, like how long the season should be, and debate the best 1-2 punch cities when it comes to athletes.
Our own Royals insider Josh Vernier sat down with KC Royals owner John Sherman and talked about this team, this season, and what to expect from the ball club. We then hear reaction to part 1 of the interview from Kling and Dusty.
Our own Josh Vernier sat down with Kansas City Royals owner John Sherman and talked about the present and future plans for the club. We hear part 2, as well as reaction from Kling and Dusty.
BWJ made the HR Derby! We have our limited-time segment Shout It Out with all of you and your texts; then we hear part 1 of Josh Vernier's interview with KC Royals owner John Sherman.
We hear Kling's and Dusty's reaction to part 1 of Vern's conversation with Royals owner John Sherman before discussing MLB free agency and good future value when it comes to NFL QBs.
We continue to listen and react to Josh Vernier's interview with Royals owner John Sherman before taking a closer look at some sports and their athletes.
Hey, at least the Chiefs have Tuesdays off! That's right... the Chiefs will have a game on every other day of the week this year except Tuesday, on a whopping NINE different TV networks. It's just crazy how in demand this team is as the schedule is released. Super Bowl hero Harrison Butker isn't feeling the love all his other teammates are right now, but we're pretty sure he can handle it. Butker is being attacked all over the country for his outstanding commencement speech at Benedictine College, and that includes by the NFL itself: the league saw a need to have its diversity office release a statement that it doesn't believe in Butker's kind of diversity. Royals owner John Sherman sends a terse letter to Jackson County, Caitlin Clark loses millions and millions of TV viewers, Jake Tapper is a bad draw for Trump at the debate in June, and a Democrat Congressman calls for cancelling the Democrats' convention.