Podcast appearances and mentions of Kevin Frazier

  • 109 PODCASTS
  • 333 EPISODES
  • 42m AVG DURATION
  • 5 WEEKLY NEW EPISODES
  • LATEST: Dec 26, 2025

POPULARITY (trend chart, 2017-2024)



Latest podcast episodes about Kevin Frazier

The Lawfare Podcast
Lawfare Daily: The Year That Was: 2025

Dec 26, 2025 · 55:34


Every year, Lawfare publishes a retrospective of the year that passed. Today, we're pleased to bring you an audio debrief of that article, The Year That Was: 2025, which you can read in full on our website starting December 31. Lawfare is focused on producing timely, rigorous, and non-partisan analysis of “hard national security choices.” And this year, that work was—to use an expression as tired as we are—like drinking from a firehose. We did our best to keep up. We published more than 1,000 articles, podcasts, videos, research papers, and primary source documents. We did livestream round-ups and rapid-response videos. We produced five different podcasts and an investigative video series. We built data visualizations and trackers to make sense of complicated unfolding events. You can find all that and more for free on our website, lawfaremedia.org. It's impossible to capture everything that happened in 2025 in the world of national security. But here's what stood out to the Lawfare team—and what they have to say about it. In this episode, you'll hear from Executive Editor Natalie Orpett on Lawfare's work in 2025 and from Editor in Chief Benjamin Wittes on The Situation. You'll hear from Senior Editors Anna Bower on DOGE, Roger Parloff on the Alien Enemies Act, Molly Roberts on politicization of the Justice Department, Eric Columbus on impoundments, Scott R. Anderson on war powers, and Kevin Frazier on AI and the states. You'll hear from Public Interest Fellows Loren Voss on domestic deployments of the military and Ariane Tabatabai on foreign policy. You'll hear from our Managing Editor, Tyler McBrien, on our narrative podcast series, Escalation. You'll hear from Associate Editors Katherine Pompilio on the Jan. 6 pardons and Olivia Manes on rolling back internal checks at the Justice Department. You'll hear from our Fellow Jakub Kraus on AI, and you'll hear from Contributing Editor Renée DiResta on election integrity capacity. And that's just a sampling of Lawfare's work. It's The Year That Was: 2025. We'll see you next year. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The John Batchelor Show
S8 Ep237: SHOW 12-23-25 THE SHOW BEGINS WITH DOUBTS OF THE EU... EU STRUGGLES WITH RUSSIAN ASSETS AND AID Colleague Judy Dempsey. Judy Dempsey discusses the EU's difficulty in utilizing frozen Russian assets and the "defeat" for Chancellor Merz

Dec 24, 2025 · 6:34


SHOW 12-23-25 THE SHOW BEGINS WITH DOUBTS OF THE EU...
1831 BRUSSELS EU STRUGGLES WITH RUSSIAN ASSETS AND AID Colleague Judy Dempsey. Judy Dempsey discusses the EU's difficulty in utilizing frozen Russian assets and the "defeat" for Chancellor Merz regarding the funding mechanism for Ukraine. NUMBER 1
THE RISE OF THE AFD IN GERMANY Colleague Judy Dempsey. Judy Dempsey continues, focusing on the rise of the AfD party in Germany and its connections to elements of the US Republican party. NUMBER 2
STALEMATES IN GAZA AND LEBANON Colleague Jonathan Schanzer. Jonathan Schanzer discusses the stalemate regarding the last hostage in Gaza, the fragmented control of the territory, and threats in Lebanon and Syria. NUMBER 3
EU REGULATION VS. US GROWTH Colleague Michael Toth. Michael Toth critiques the European Union's "regulatory imperialism" and contrasts it with the economic growth of the US. NUMBER 4
STATE DEPARTMENT RECALLS AND STRATEGY Colleague Mary Kissel. Mary Kissel discusses the recall of career ambassadors by the Trump administration and challenges in Panama and Greenland. NUMBER 5
AUSTRALIA'S DEFENSE AND CHINA Colleague Grant Newsham. Grant Newsham warns about Australia's lack of defense capabilities and the erosion of its influence in the Pacific islands due to Chinese political warfare. NUMBER 6
THE BORING BENEFITS OF AI Colleague Kevin Frazier. Kevin Frazier advocates for the "boring use cases" of AI, such as in healthcare and traffic management, to save costs and improve efficiency. NUMBER 7
REGULATING ARTIFICIAL INTELLIGENCE Colleague Kevin Frazier. Kevin Frazier continues, warning against a "waterfall of regulation" by states and advocating for "regulatory sandboxes" to allow experimentation. NUMBER 8
US EXPANSIONISM AND DIPLOMATIC RIFTS Colleague Gregory Copley. Gregory Copley analyzes US foreign policy moves regarding Greenland, Panama, and Venezuela, describing them as a return to "might is right" expansionism. NUMBER 9
THE MONROE DOCTRINE AND NAVAL POWER Colleague Gregory Copley. Gregory Copley continues, debating whether the US is a naval or continental power in the context of enforcing the Monroe Doctrine and discussing a proposal for new battleships. NUMBER 10
THE DECLINE OF LITERACY AND CONTEXT Colleague Gregory Copley. Gregory Copley continues, discussing the decline of literacy and context since the mid-20th century, comparing modern society to the Eloi and Morlocks of H.G. Wells. NUMBER 11
KING CHARLES III AND UK POLITICAL TURMOIL Colleague Gregory Copley. Gregory Copley continues, analyzing the challenges King Charles III faces under the Keir Starmer government, which Copley compares to the era of Oliver Cromwell. NUMBER 12
THE LEGEND OF THE HESSIANS Colleague Professor Richard Bell. Professor Richard Bell discusses the American fear of Hessian soldiers and Washington's strategic victory at Trenton. NUMBER 13
FRANCE'S GLOBAL STRATEGY IN THE REVOLUTION Colleague Professor Richard Bell. Professor Richard Bell continues, highlighting the role of Foreign Minister Vergennes and how French involvement expanded the war globally. NUMBER 14
BENEDICT ARNOLD AND PEGGY SHIPPEN Colleague Professor Richard Bell. Professor Richard Bell continues, discussing Peggy Shippen's influence on Benedict Arnold's defection and their subsequent life in London. NUMBER 15
THE ACCIDENTAL COLONIZATION OF AUSTRALIA Colleague Professor Richard Bell. Professor Richard Bell concludes, recounting the story of convict William Murray and the accidental selection of Australia as a penal colony following the loss of the American colonies. NUMBER 16

The John Batchelor Show
S8 Ep235: THE BORING BENEFITS OF AI Colleague Kevin Frazier. Kevin Frazier advocates for the "boring use cases" of AI, such as in healthcare and traffic management, to save costs and improve efficiency. NUMBER 7

Dec 24, 2025 · 10:50


THE BORING BENEFITS OF AI Colleague Kevin Frazier. Kevin Frazier advocates for the "boring use cases" of AI, such as in healthcare and traffic management, to save costs and improve efficiency. NUMBER 7 JANUARY 1951

The John Batchelor Show
S8 Ep235: REGULATING ARTIFICIAL INTELLIGENCE Colleague Kevin Frazier. Kevin Frazier continues, warning against a "waterfall of regulation" by states and advocating for "regulatory sandboxes" to allow experimentation. NUMBER 8

Dec 24, 2025 · 8:50


REGULATING ARTIFICIAL INTELLIGENCE Colleague Kevin Frazier. Kevin Frazier continues, warning against a "waterfall of regulation" by states and advocating for "regulatory sandboxes" to allow experimentation. NUMBER 8 NOVEMBER 1955

The John Batchelor Show
S8 Ep233: PREVIEW WARNING AGAINST FRAGMENTED STATE-LEVEL AI REGULATION Colleague Kevin Frazier. Kevin Frazier, a University of Texas Law School fellow, warns against fragmented AI regulation by individual states seeking tax revenue. He advocates for a nat

Dec 23, 2025 · 2:16


PREVIEW WARNING AGAINST FRAGMENTED STATE-LEVEL AI REGULATION Colleague Kevin Frazier. Kevin Frazier, a University of Texas Law School fellow, warns against fragmented AI regulation by individual states seeking tax revenue. He advocates for a national framework rather than hasty local laws, arguing that allowing technology to develop through "trial and error" is superior to heavy-handed, immediate restrictions.

Arbiters of Truth
Rapid Response on the AI Preemption Executive Order

Dec 12, 2025 · 56:12


In this rapid response episode, Lawfare senior editors Alan Rozenshtein and Kevin Frazier and Lawfare Tarbell fellow Jakub Kraus discuss President Trump's new executive order on federal preemption of state AI laws, the politics of AI regulation and the split between Silicon Valley Republicans and MAGA populists, and the administration's decision to allow Nvidia to export H200 chips to China. Mentioned in this episode: the executive order "Ensuring a National Policy Framework for Artificial Intelligence" and Charlie Bullock, "Legal Issues Raised by the Proposed Executive Order on AI Preemption," Institute for Law & AI. Hosted on Acast. See acast.com/privacy for more information.

Arbiters of Truth
Graham Dufault on small businesses and navigating EU AI laws

Dec 9, 2025 · 45:17


Graham Dufault, General Counsel at ACT | The App Association, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to explore how small- and medium-sized enterprises (SMEs) are navigating the EU's AI regulatory framework. The duo break down the Association's recent survey of SMEs, which gathered the views of more than 1,000 enterprises on AI regulation and adoption. Follow Graham: @GDufault and ACT | The App Association: @actonline. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: Caleb Withers on the Cybersecurity Frontier in the Age of AI

Dec 5, 2025 · 49:00


Caleb Withers, a researcher at the Center for a New American Security, joins Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss how frontier models shift the balance in favor of attackers in cyberspace. The two discuss how labs and governments can take steps to address these asymmetries favoring attackers, and the future of cyber warfare driven by AI agents. Jack Mitchell, a student fellow in the AI Innovation and Law Program at the University of Texas School of Law, provided excellent research assistance on this episode.Check out Caleb's recent research here. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Arbiters of Truth
Caleb Withers on the Cybersecurity Frontier in the Age of AI

Dec 2, 2025 · 48:17


Caleb Withers, a researcher at the Center for a New American Security, joins Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss how frontier models shift the balance in favor of attackers in cyberspace. The two discuss how labs and governments can take steps to address these asymmetries favoring attackers, and the future of cyber warfare driven by AI agents.Jack Mitchell, a student fellow in the AI Innovation and Law Program at the University of Texas School of Law, provided excellent research assistance on this episode.Check out Caleb's recent research here. Hosted on Acast. See acast.com/privacy for more information.

Arbiters of Truth
A Startup's Perspective on AI Policy

Nov 25, 2025 · 51:48


Andrew Prystai, CEO and co-founder of Vesta, and Thomas Bueler-Faudree, co-founder of August Law, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to think through AI policy from the startup perspective. Andrew and Thomas are the sorts of entrepreneurs that politicians on both sides of the aisle talk about at town halls and in press releases. They're creating jobs and pushing the technological frontier. So what do they want AI policy leaders to know as lawmakers across the country weigh regulatory proposals? That's the core question of the episode. Giddy up for a great chat! Learn more about the guests and their companies here: Andrew's LinkedIn, Vesta's LinkedIn, Thomas's LinkedIn, August's LinkedIn. Hosted on Acast. See acast.com/privacy for more information.

Pastor David Walker- "There Is More!"
Conversation Series - Ep18 - Kevin Frazier

Nov 24, 2025 · 37:02


Episode 18 | There is More with David Walker. In this powerful episode, David sits down with Kevin Frazier, a Texas Parks and Wildlife Game Warden with more than 24 years of service. From chasing smugglers on Falcon Lake to leading life-saving flood rescues, Kevin has seen the best and worst of humanity — and through it all, he's learned the power of faith, teamwork, and compassion. Kevin opens up about his childhood dream of becoming a game warden, inspired by his Boy Scout master, and the long road that brought him there. But it's his story from the Uvalde flood rescue that leaves the deepest mark — the day he and his team searched for missing children and found a young girl who reminded him of his own daughters. In that moment, he stopped everything to pray over her, saying, “If you don't want to be a part of this, walk away, but I'm going to pray.” This conversation reveals the heart behind the badge — a man of faith, humility, and courage who serves because he believes in helping others, even when it hurts.

Arbiters of Truth
Anthropic's General Counsel, Jeff Bleich, Explores the Intersection of Law, Business, and Emerging Technology

Nov 18, 2025 · 36:51


Jeff Bleich, General Counsel at Anthropic, former Chief Legal Officer at Cruise, and former Ambassador to Australia during the Obama administration, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to get a sense of how the practice of law looks at the edge of the AI frontier.The two also review how Jeff's prior work in the autonomous vehicle space prepared him for the challenges and opportunities posed by navigating legal uncertainties in AI governance. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: The AI Economy and You: How AI Is, Will, and May Alter the Nature of Work and Economic Growth with Anton Korinek, Nathan Goldschlag, and Bharat Chander

Nov 14, 2025 · 44:44


Anton Korinek, a professor of economics at the University of Virginia and newly appointed economist to Anthropic's Economic Advisory Council; Nathan Goldschlag, Director of Research at the Economic Innovation Group; and Bharat Chander, Economist at Stanford Digital Economy Lab, join Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to sort through the myths, truths, and ambiguities that shape the important debate around the effects of AI on jobs. They discuss what happens when machines begin to outperform humans in virtually every computer-based task, how that transition might unfold, and what policy interventions could ensure broadly shared prosperity. These three are prolific researchers. Give them a follow to find their latest works: Anton: @akorinek on X; Nathan: @ngoldschlag and @InnovateEconomy on X; Bharat: @BharatKChandar on X, @bharatchandar on LinkedIn and Substack. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The ALL NEW Big Wakeup Call with Ryan Gatenby

Send us a text. A fun chat with comedian and actor ARIES SPEARS, who is performing at The Improv in Schaumburg this weekend with 7 and 9:30 shows on Friday; 3:30, 7, and 9:30 on Saturday; and 6:00 on Sunday. ARIES SPEARS Biography: Ever since New York native Aries Spears was 14 years old, he has been a force to be reckoned with in the comedy scene throughout America. His quick wit, charisma and ferociously aggressive style of comedy have earned him critical acclaim, high accolades, and above all, a busy schedule. From being a regular on Fox's Mad TV, starring in feature films, appearing on a number of national talk shows, and continually touring the country, where he sells out each and every city, Aries' talents are widespread: he just passed 3 million spins on Pandora and his podcast is one of the "Musts" in the US. This past summer, Aries guest starred in CNN's “See It Loud,” a new summer series on the history of Black television that was one of the network's hit docuseries of 2023, alongside Amanda Seales, Da'Vinchi, Debbie Allen, Deon Cole, Desus & Mero, Gabrielle Union, Jimmie Walker, Judge Greg Mathis, Kevin Frazier, Loni Love, Lynn Whitfield, Mo'Nique, Naturi Naughton, Omari Hardwick, Ray J, Ruben Studdard, Sherri Shepherd, Tatyana Ali, Tiffany Pollard, Tisha Campbell, Vivica A. Fox and many more. Trailer: https://tinyurl.com/AriesCNNSeeItLoud As a principal cast member on Fox's hit sketch comedy show Mad TV, Aries brought a fresh, hip style to the already-edgy program from the third through the tenth season. The producers made full use of many of Aries's talents by calling on him frequently to create new hilarious characters and write sketches, where he thrived for eight seasons. Some of the many recurring characters that Aries is famous for on the show include Belma Buttons, Bill Cosby, Jesse Jackson, Mike Tyson, Walter (Crackheads), Reggie (Erascist), Dollar Bill Montgomery, Shaquille O'Neal, The Klumps, Michael Jackson, Sisqo, Evander Holyfield, El Diablo Negro and more. Aries also boasts a number of uproarious impersonations on the show, including James Brown, Al Pacino, and his childhood idol Eddie Murphy. The Aries-branded sketch “Talkin' American” was Mad TV's most popular bit, keeping fans tuned in and boosting Mad's ratings impressively on Saturday night. Aries has also found great success in the world of feature films. At the age of 17, after being spotted in a comedy club, Aries landed a part in the movie Home of Angels, which eventually led to a starring role alongside John Leguizamo in TriStar's The Pest, followed by the notable role as Cuba Gooding, Jr.'s brother in Columbia's blockbuster hit, Jerry Maguire. Aries maintains a strong connection to his African roots and is involved in many charitable events, including a number of organizations that help abused women. He feels blessed to have been able to achieve everything that he has, but he in no way intends on stopping there. Given the extreme growing success rate of action comedies coming out of Hollywood, it's safe to say that Aries Spears will be in demand for quite some time. https://www.ariesspears.com

Arbiters of Truth
The AI Economy and You: How AI Is, Will, and May Alter the Nature of Work and Economic Growth with Anton Korinek, Nathan Goldschlag, and Bharat Chander

Nov 11, 2025 · 43:56


Anton Korinek, a professor of economics at the University of Virginia and newly appointed economist to Anthropic's Economic Advisory Council; Nathan Goldschlag, Director of Research at the Economic Innovation Group; and Bharat Chander, Economist at Stanford Digital Economy Lab, join Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to sort through the myths, truths, and ambiguities that shape the important debate around the effects of AI on jobs. We discuss what happens when machines begin to outperform humans in virtually every computer-based task, how that transition might unfold, and what policy interventions could ensure broadly shared prosperity. These three are prolific researchers. Give them a follow to find their latest works: Anton: @akorinek on X; Nathan: @ngoldschlag and @InnovateEconomy on X; Bharat: @BharatKChandar on X, @bharatchandar on LinkedIn and Substack. Hosted on Acast. See acast.com/privacy for more information.

The John Batchelor Show
53: Regulating AI and Protecting Children. Kevin Frazier (Law School Fellow at the University of Texas at Austin) addresses the growing concern over AI chatbots following tragedies, noting that while only 1.9% of ChatGPT conversations relate to "rela

Nov 6, 2025 · 11:00


Regulating AI and Protecting Children. Kevin Frazier (Law School Fellow at the University of Texas at Austin) addresses the growing concern over AI chatbots following tragedies, noting that while only 1.9% of ChatGPT conversations relate to "relationships," this fraction still warrants significant attention. He criticizes early state legislative responses, such as Illinois banning AI therapy tools, arguing that such actions risk denying mental health support to children who cannot access human therapists. Frazier advocates against imposing restrictive statutory law on the rapidly evolving technology. Instead, he recommends implementing a voluntary, standardized rating system, similar to the MPA film rating system. This framework would provide consumers with digestible information via labels—like "child safe" or "mental health appropriate"—to make informed decisions and incentivize industry stakeholders to develop safer applications. 1919

The John Batchelor Show
53: Regulating AI and Protecting Children. Kevin Frazier (Law School Fellow at the University of Texas at Austin) addresses the growing concern over AI chatbots following tragedies, noting that while only 1.9% of ChatGPT conversations relate to "rela

Nov 6, 2025 · 6:49


Regulating AI and Protecting Children. Kevin Frazier (Law School Fellow at the University of Texas at Austin) addresses the growing concern over AI chatbots following tragedies, noting that while only 1.9% of ChatGPT conversations relate to "relationships," this fraction still warrants significant attention. He criticizes early state legislative responses, such as Illinois banning AI therapy tools, arguing that such actions risk denying mental health support to children who cannot access human therapists. Frazier advocates against imposing restrictive statutory law on the rapidly evolving technology. Instead, he recommends implementing a voluntary, standardized rating system, similar to the MPA film rating system. This framework would provide consumers with digestible information via labels—like "child safe" or "mental health appropriate"—to make informed decisions and incentivize industry stakeholders to develop safer applications. 1941

The John Batchelor Show
51: PREVIEW. The Crisis of AI Literacy: Protecting Vulnerable Communities from Misusing Chatbots. Kevin Frazier discusses the dangers of young people misusing AI chatbots due to a significant lack of public awareness and basic AI literacy. Designers assum

Nov 5, 2025 · 3:10


PREVIEW. The Crisis of AI Literacy: Protecting Vulnerable Communities from Misusing Chatbots. Kevin Frazier discusses the dangers of young people misusing AI chatbots due to a significant lack of public awareness and basic AI literacy. Designers assume users understand that chatbots are merely optimization tools, not real people with real opinions. Frazier stresses the need to educate consumers on the proper and improper uses of these tools to support responsible innovation. 1951

Arbiters of Truth
Anthropic's Gabriel Nicholas Analyzes AI Agents

Nov 4, 2025 · 48:50


Gabriel Nicholas, a member of the Product Public Policy team at Anthropic, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to introduce the policy problems (and some solutions) posed by AI agents. AI agents, defined as AI tools capable of autonomously completing tasks on your behalf, are widely expected to soon become ubiquitous. The integration of AI agents into sensitive tasks presents a slew of technical, social, economic, and political questions. Gabriel walks through the weighty questions that labs are thinking through as AI agents finally become “a thing.” Hosted on Acast. See acast.com/privacy for more information.

Let People Prosper
Harnessing AI for Human Flourishing with Kevin Frazier | Let People Prosper Ep. 172

Oct 30, 2025 · 46:44


Artificial intelligence isn't just transforming industries—it's redefining freedom, opportunity, and the future of human work. This week on the Let People Prosper Show, I talk with Kevin Frazier, the inaugural AI Innovation and Law Fellow at the University of Texas School of Law, where he leads their groundbreaking new AI Innovation and Law Program. Kevin's at the center of the national conversation on how to balance innovation with accountability—and how to make sure regulation doesn't crush the technological progress that drives prosperity. With degrees from UC Berkeley Law, Harvard Kennedy School, and the University of Oregon, Kevin brings both a legal and policy lens to today's most pressing questions about AI, federalism, and the economy. Before joining UT, he served as an Assistant Professor at St. Thomas University College of Law and conducted research for the Institute for Law and AI. His scholarship has appeared in the Tennessee Law Review, MIT Technology Review, and Lawfare. He also co-hosts the Scaling Laws Podcast, bridging the gap between innovation and regulation. This episode goes deep into how we can harness AI to promote human flourishing, not government dependency—how we can regulate based on reality, not fear—and how federalism can help America remain the global leader in technological innovation. For more insights, visit vanceginn.com. You can also get even greater value by subscribing to my Substack newsletter at vanceginn.substack.com. Please share with your friends, family, and broader social media network.

The Lawfare Podcast
Scaling Laws: Sen. Scott Wiener on California Senate Bill 53

Oct 24, 2025 · 50:10


California State Senator Scott Wiener, author of Senate Bill 53—a frontier AI safety bill signed into law by Governor Newsom earlier this month—joins Alan Rozenshtein, Associate Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to explain the significance of SB 53 in the larger debate about how to govern AI. The trio analyze the lessons that Senator Wiener learned from the battle over SB 1047, a related bill that Newsom vetoed last year, explore SB 53's key provisions, and forecast what may be coming next in Sacramento and D.C. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Arbiters of Truth
Sen. Scott Wiener on California Senate Bill 53

Oct 21, 2025 · 49:06


California State Senator Scott Wiener, author of Senate Bill 53--a frontier AI safety bill signed into law by Governor Newsom earlier this month--joins Alan Rozenshtein, Associate Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to explain the significance of SB 53 in the larger debate about how to govern AI. The trio analyze the lessons that Senator Wiener learned from the battle over SB 1047, a related bill that Newsom vetoed last year, explore SB 53's key provisions, and forecast what may be coming next in Sacramento and D.C. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: AI and Energy: What Do We Know? What Are We Learning?

Oct 17, 2025 · 52:18


Mosharaf Chowdhury, Associate Professor at the University of Michigan and Director of the ML Energy lab, and Dan Zhou, former Senior Research Scientist at the MIT Lincoln Lab, MIT Supercomputing Center, and MIT CSAIL, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss the energy costs of AI. They break down exactly how much energy fuels a single ChatGPT query, why this is difficult to figure out, how we might improve energy efficiency, and what kinds of policies might minimize AI's growing energy and environmental costs. Leo Wu provided excellent research assistance on this podcast. Read more from Mosharaf: The ML Energy Initiative, and "We did the math on AI's energy footprint. Here's the story you haven't heard," in MIT Technology Review. Read more from Dan: "From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference," in Proc. IEEE High Perform. Extreme Comput. Conf. (HPEC), and "A Green(er) World for A.I.," in IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW). Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The John Batchelor Show
HEADLINE: AI Regulation Debate: Premature Laws vs. Emerging Norms GUEST NAME: Kevin Frazier SUMMARY: Kevin Frazier critiques the legislative rush to regulate AI, arguing that developing norms might be more effective than premature laws. He notes that bill

Oct 16, 2025 · 13:46


    HEADLINE: AI Regulation Debate: Premature Laws vs. Emerging Norms GUEST NAME: Kevin Frazier SUMMARY: Kevin Frazier critiques the legislative rush to regulate AI, arguing that developing norms might be more effective than premature laws. He notes that bills like California's AB 1047, which demands factual accuracy, fundamentally misunderstand AI's generative nature. Imposing vague standards, as seen in New York's RAISE Act, risks chilling innovation and preventing widespread benefits, like affordable legal or therapy tools. Frazier emphasizes that AI policy should be grounded in empirical data rather than speculative fears. 1960

The John Batchelor Show
HEADLINE: AI Regulation Debate: Premature Laws vs. Emerging Norms GUEST NAME: Kevin Frazier SUMMARY: Kevin Frazier critiques the legislative rush to regulate AI, arguing that developing norms might be more effective than premature laws. He notes that bill

Oct 16, 2025 · 5:54


HEADLINE: AI Regulation Debate: Premature Laws vs. Emerging Norms GUEST NAME: Kevin Frazier SUMMARY: Kevin Frazier critiques the legislative rush to regulate AI, arguing that developing norms might be more effective than premature laws. He notes that bills like California's AB 1047, which demands factual accuracy, fundamentally misunderstand AI's generative nature. Imposing vague standards, as seen in New York's RAISE Act, risks chilling innovation and preventing widespread benefits, like affordable legal or therapy tools. Frazier emphasizes that AI policy should be grounded in empirical data rather than speculative fears. 1958

Arbiters of Truth
AI and Energy: What do we know? What are we learning?

Oct 14, 2025 · 51:32


Mosharaf Chowdhury, associate professor at the University of Michigan and director of the ML Energy lab, and Dan Zhao, AI researcher at MIT, GoogleX, and Microsoft focused on AI for science and sustainable and energy-efficient AI, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss the energy costs of AI. They break down exactly how much energy fuels a single ChatGPT query, why this is difficult to figure out, how we might improve energy efficiency, and what kinds of policies might minimize AI's growing energy and environmental costs. Leo Wu provided excellent research assistance on this podcast. Read more from Mosharaf: https://ml.energy/ and https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/ Read more from Dan: https://arxiv.org/abs/2310.03003 and https://arxiv.org/abs/2301.11581 Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: AI Safety Meet Trust & Safety with Ravi Iyer and David Sullivan

Oct 10, 2025 · 47:29


David Sullivan, Executive Director of the Digital Trust & Safety Partnership, and Ravi Iyer, Managing Director of the Psychology of Technology Institute at USC's Neely Center, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss the evolution of the Trust & Safety field and its relevance to ongoing conversations about how best to govern AI. They discuss the importance of thinking about the end user in regulation, debate the differences and similarities between social media and AI companions, and evaluate current policy proposals. Leo Wu provided excellent research assistance to prepare for this podcast. Read more from David: "Why we need to make safety the product to build better bots," from the World Economic Forum Centre for AI Excellence, and "Learning from the Past to Shape the Future of Digital Trust and Safety," in Tech Policy Press. Read more from Ravi: "Ravi Iyer on How to Improve Technology Through Design," from Lawfare's Arbiters of Truth series, and "Regulate Design, not Speech," from the Designing Tomorrow Substack. Read more from Kevin: "California in Your Chatroom: AB 1064's Likely Constitutional Overreach," from the Cato Institute. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Arbiters of Truth
AI Safety Meet Trust & Safety with Ravi Iyer and David Sullivan

Oct 7, 2025 · 46:40


David Sullivan, Executive Director of the Digital Trust & Safety Partnership, and Ravi Iyer, Managing Director of the Psychology of Technology Institute at USC's Neely Center, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss the evolution of the Trust & Safety field and its relevance to ongoing conversations about how best to govern AI. They discuss the importance of thinking about the end user in regulation, debate the differences and similarities between social media and AI companions, and evaluate current policy proposals. You'll “like” (bad pun intended) this one. Leo Wu provided excellent research assistance to prepare for this podcast. Read more from David: https://www.weforum.org/stories/2025/08/safety-product-build-better-bots/ and https://www.techpolicy.press/learning-from-the-past-to-shape-the-future-of-digital-trust-and-safety/ Read more from Ravi: https://shows.acast.com/arbiters-of-truth/episodes/ravi-iyer-on-how-to-improve-technology-through-design and https://open.substack.com/pub/psychoftech/p/regulate-value-aligned-design-not?r=2alyy0&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false Read more from Kevin: https://www.cato.org/blog/california-chatroom-ab-1064s-likely-constitutional-overreach Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Lawfare Archive: Bob Bauer and Liza Goitein on Emergency Powers Reform

Oct 4, 2025 · 47:36


From September 20, 2024: Bob Bauer, Professor of Practice and Distinguished Scholar in Residence at New York University School of Law, and Liza Goitein, Senior Director of Liberty & National Security at the Brennan Center, join Kevin Frazier, Assistant Professor at St. Thomas University College of Law and a Tarbell Fellow at Lawfare, to review the emergency powers afforded to the president under the National Emergency Act, International Emergency Economic Powers Act, and the Insurrection Act. The trio also inspect ongoing bipartisan efforts to reform emergency powers.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

RTP's Free Lunch Podcast
Law for Little Tech: Part 6 - Does the Little Tech Agenda Work for Startups?

Oct 3, 2025 · 32:02 (transcription available)


Startups often struggle to balance financial constraints with the pursuit of innovation, raising questions about how they can effectively advocate for themselves within the tech industry. In Washington, D.C. and abroad, various organizations promote the growth of smaller innovators, yet many "little tech" firms still face challenges meeting regulatory requirements. How do regulatory frameworks affect smaller innovators and their ability to compete? What balance should be struck between oversight and innovation? How can policymakers incentivize little tech companies without creating a disadvantage for Big Tech firms or consumers?Join the Federalist Society’s Regulatory Transparency Project and host Prof. Kevin Frazier for an in-depth discussion of the “Little Tech Agenda” with special guest Kate Tummarello at Engine | Advocacy & Foundation.

RTP's Free Lunch Podcast
Law for Little Tech: Part 4 - What are the Gaps in the Little Tech Agenda?

Oct 3, 2025 · 37:56 (transcription available)


“Starting small, but aspiring to grow” defines the little tech agenda. Big Tech companies often depend on smaller innovators for key components of manufacturing and new technologies. With this dependence on little tech, what are the “gaps” in its agenda? The U.S. has technological capital waiting to be unlocked by small innovators. What steps can be taken to address this gap and channel little tech's efforts towards our national interests? Can we strike a balance between Big Tech and little tech to further the goals of the United States’ technological development? Join the Federalist Society’s Regulatory Transparency Project and host Prof. Kevin Frazier for an in-depth discussion of the “Little Tech Agenda” with special guest Sam Hammond of the Foundation for American Innovation.

RTP's Free Lunch Podcast
Law for Little Tech: Part 5 - The Influence of Sand Hill Road on the Little Tech Agenda

Oct 3, 2025 · 34:41 (transcription available)


Over the past 30 years, the United States has experienced rapid technological change. Yet in recent years, innovation appears to have plateaued. The iPhone of four years ago is nearly identical to today’s model, and the internet has changed little over the same period. Little tech companies play a significant role in generating new ideas and technological development. In this episode, experts discuss the financial gains and risks of incentivizing little tech innovation and offer policy recommendations that encourage investment in the "littlest tech" firms to drive future breakthroughs. Join the Federalist Society’s Regulatory Transparency Project and host Prof. Kevin Frazier for an in-depth discussion of the “Little Tech Agenda” with special guest Dave Karpf, Associate Professor at the George Washington University School of Media and Public Affairs.

The Lawfare Podcast
Scaling Laws: The Ivory Tower and AI (Live from IHS's Technology, Liberalism, and Abundance Conference)

Oct 1, 2025 · 43:22


Neil Chilson, Head of AI Policy at the Abundance Institute, and Gus Hurwitz, Senior Fellow and CTIC Academic Director at Penn Carey Law School and Director of Law & Economics Programs at the International Center for Law & Economics, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to explore how academics can overcome the silos and incentives that plague the Ivory Tower and positively contribute to the highly complex, evolving, and interdisciplinary work associated with AI governance. The trio recorded this podcast live at the Institute for Humane Studies's Technology, Liberalism, and Abundance Conference in Arlington, Virginia. Read about Kevin's thinking on the topic here: https://www.civitasinstitute.org/research/draining-the-ivory-tower Learn about the Conference: https://www.theihs.org/blog/curated-event/technology-abundance-and-liberalism/ Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The John Batchelor Show
HEADLINE: Russian Spy Ships Target Vulnerable Undersea Communication Cables GUEST NAME: Kevin Frazier 50 WORD SUMMARY: Undersea cables are highly vulnerable to sabotage or accidental breaks. Russia uses sophisticated naval technology, including the spy sh

Sep 30, 2025 · 10:30


HEADLINE: Russian Spy Ships Target Vulnerable Undersea Communication Cables GUEST NAME: Kevin Frazier 50 WORD SUMMARY: Undersea cables are highly vulnerable to sabotage or accidental breaks. Russia uses sophisticated naval technology, including the spy ship Yantar, to map and potentially break these cables in sensitive locations. The US is less vulnerable due to redundancy. However, protection is fragmented, relying on private owners who often lack incentives to adopt sophisticated defense techniques. 1945 RED SQUARE

The John Batchelor Show
Preview: Kevin Frazier discusses the extreme vulnerability and fragmented state of undersea cables, the vast majority of which are privately owned. The Department of Defense relies on these systems, which lack sufficient protection due to high costs. Fraz

Sep 29, 2025 · 2:22


Preview: Kevin Frazier discusses the extreme vulnerability and fragmented state of undersea cables, the vast majority of which are privately owned. The Department of Defense relies on these systems, which lack sufficient protection due to high costs. Frazier highlights recent reports that the Russian ship Yantar, under GRU possession, is tracking and mapping these vital cables near Great Britain in the event of conflict.

The John Batchelor Show
Preview: Kevin Frazier discusses the extreme vulnerability and fragmented state of undersea cables, the vast majority of which are privately owned. The Department of Defense relies on these systems, which lack sufficient protection due to high costs. Fraz

Sep 29, 2025 · 1:40


Preview: Kevin Frazier discusses the extreme vulnerability and fragmented state of undersea cables, the vast majority of which are privately owned. The Department of Defense relies on these systems, which lack sufficient protection due to high costs. Frazier highlights recent reports that the Russian ship Yantar, under GRU possession, is tracking and mapping these vital cables near Great Britain in the event of conflict.

The John Batchelor Show
Kevin Frazier testified that Congress needs a national vision to manage data center infrastructure and mitigate local impacts. He stressed vulnerable undersea cables are neglected and urged academics to prioritize teaching and public-oriented research.

Sep 25, 2025 · 9:56


Kevin Frazier testified that Congress needs a national vision to manage data center infrastructure and mitigate local impacts. He stressed vulnerable undersea cables are neglected and urged academics to prioritize teaching and public-oriented research.

The John Batchelor Show
Kevin Frazier testified that Congress needs a national vision to manage data center infrastructure and mitigate local impacts. He stressed vulnerable undersea cables are neglected and urged academics to prioritize teaching and public-oriented research.

Sep 25, 2025 · 9:44


Kevin Frazier testified that Congress needs a national vision to manage data center infrastructure and mitigate local impacts. He stressed vulnerable undersea cables are neglected and urged academics to prioritize teaching and public-oriented research. 1939

The John Batchelor Show
Preview: Kevin Frazier of University of Texas Law School/Civitas Institute discusses congressional concerns over AI regulation, balancing state interests versus federal goals of preventing cross-state policy projection and prioritizing national AI innovat

Sep 24, 2025 · 1:29


Preview: Kevin Frazier of University of Texas Law School/Civitas Institute discusses congressional concerns over AI regulation, balancing state interests versus federal goals of preventing cross-state policy projection and prioritizing national AI innovation and growth.

The Lawfare Podcast
Lawfare Archive: Jane Bambauer, Ramya Krishnan, and Alan Rozenshtein on the Constitutionality of the TikTok Bill

Sep 21, 2025 · 42:55


From September 18, 2024: Jane Bambauer, Professor at Levin College of Law; Ramya Krishnan, Senior Staff Attorney at the Knight First Amendment Institute and a lecturer in law at Columbia Law School; and Alan Rozenshtein, Associate Professor of Law at the University of Minnesota Law School and a Senior Editor at Lawfare, join Kevin Frazier, Assistant Professor at St. Thomas University College of Law and a Tarbell Fellow at Lawfare, to break down the D.C. Circuit Court of Appeals' hearing in TikTok v. Garland, in which a panel of judges assessed the constitutionality of the TikTok bill. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: The State of AI Safety with Steven Adler

Sep 12, 2025 · 49:14


Steven Adler, former OpenAI safety researcher, author of Clear-Eyed AI on Substack, and independent AGI-readiness researcher, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and Senior Fellow at Lawfare, to assess the current state of AI testing and evaluations. The two walk through Steven's views on industry efforts to improve model testing and what he thinks regulators ought to know and do when it comes to preventing AI harms. Thanks to Leo Wu for research assistance! Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: Contrasting and Conflicting Efforts to Regulate Big Tech: EU v. U.S.

Sep 5, 2025 · 47:04


Anu Bradford, Professor at Columbia Law School, and Kate Klonick, Senior Editor at Lawfare and Associate Professor at St. John's University School of Law, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to assess the ongoing, contrasting, and, at times, conflicting regulatory approaches to Big Tech being pursued by the EU and U.S. The trio start with an assessment of the EU's use of the Brussels Effect, coined by Anu, to shape AI development. Next, they explore the U.S.'s increasingly interventionist industrial policy with respect to key sectors, especially tech. Read more: Anu's op-ed in The New York Times; "The Impact of Regulation on Innovation," by Philippe Aghion, Antonin Bergeaud, and John Van Reenen; and the Draghi Report on the Future of European Competitiveness. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Lawfare Archive: Richard Albert on Constitutional Resilience Amid Political Tumult

Aug 31, 2025 · 46:41


From August 23, 2024: Richard Albert, William Stamps Farish Professor in Law, Professor of Government, and Director of Constitutional Studies at the University of Texas at Austin, joins Kevin Frazier, Assistant Professor at St. Thomas University College of Law and a Tarbell Fellow at Lawfare, to conduct a comparative analysis of what helps constitutions withstand political pressures. Richard's extensive study of different means to amend constitutions shapes their conversation about whether the U.S. Constitution has become too rigid.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: Uncle Sam Buys In: Examining the Intel Deal 

Aug 29, 2025 · 48:22


Peter E. Harrell, Adjunct Senior Fellow at the Center for a New American Security, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to examine the White House's announcement that it will take a 10% share of Intel. They dive into the policy rationale for the stake as well as its legality. Peter and Kevin also explore whether this is just the start of such deals given that President Trump recently declared that “there will be more transactions, if not in this industry then other industries.”Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The John Batchelor Show
AI: REGULATING LLM - KEVIN FRAZIER, CIVITAS INSTITUTE

Aug 21, 2025 · 14:20


AI: REGULATING LLM - KEVIN FRAZIER, CIVITAS INSTITUTE 1941

The John Batchelor Show
AI: REGULATING LLM - KEVIN FRAZIER, CIVITAS INSTITUTE CONTINUED

Aug 21, 2025 · 3:30


AI: REGULATING LLM - KEVIN FRAZIER, CIVITAS INSTITUTE CONTINUED 1952

The John Batchelor Show
Preview: AGI Regulation Colleague Kevin Frazier comments on the tentative state of LLMs, which need time to develop before being judged or derided by lawmakers. More later.

Aug 20, 2025 · 1:52


Preview: AGI Regulation Colleague Kevin Frazier comments on the tentative state of LLMs, which need time to develop before being judged or derided by lawmakers. More later.

The Lawfare Podcast
Scaling Laws: What's Next in AI Policy (and for Dean Ball)?

Aug 15, 2025 · 59:14


In this episode of Scaling Laws, Dean Ball, Senior Fellow at the Foundation for American Innovation and former Senior Policy Advisor for Artificial Intelligence and Emerging Technology, White House Office of Science and Technology Policy, joins Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, and Alan Rozenshtein, Associate Professor at Minnesota Law and Research Director at Lawfare, to share an inside perspective of the Trump administration's AI agenda, with a specific focus on the AI Action Plan. The trio also explore Dean's thoughts on the recently released ChatGPT-5 and the ongoing geopolitical dynamics shaping America's domestic AI policy.Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: What Keeps OpenAI's Product Policy Staff Up at Night? A Conversation with Brian Fuller

Aug 8, 2025 · 51:16


Brian Fuller, a member of the Product Policy Team at OpenAI, joins Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to analyze how large AI labs go about testing their models for compliance with internal requirements and various legal obligations. They also cover the ins and outs of what it means to work in product policy and what issues are front of mind for in-house policy teams amid substantial regulatory uncertainty.Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Scaling Laws: Renée DiResta and Alan Rozenshtein on the ‘Woke AI' Executive Order

Aug 1, 2025 · 46:48


Renée DiResta, an Associate Research Professor at the McCourt School of Public Policy at Georgetown and a Contributing Editor at Lawfare, and Alan Rozenshtein, an Associate Professor at Minnesota Law, Research Director at Lawfare, and, with the exception of today, co-host on the Scaling Laws podcast, join Kevin Frazier, the AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to take a look at the Trump Administration's Woke AI policies, as set forth by a recent EO and explored in the AI Action Plan. Read the Woke AI executive order, the AI Action Plan, and "Generative Baseline Hell and the Regulation of Machine-Learning Foundation Models," by James Grimmelmann, Blake Reid, and Alan Rozenshtein. Find Scaling Laws on the Lawfare website, and subscribe to never miss an episode. To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute. Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.