In this episode of the "AI & Equality" podcast, Senior Fellow Anja Kaspersen speaks with Heather Roff, senior research scientist at the Center for Naval Analyses. They cover the gamut of AI systems and military affairs, from ethics and history to robots, war, and conformity testing. Plus, they discuss how to become alchemists of meaning in the digital age. For more, please go to: https://carnegiecouncil.co/aiei-roff
In a rapidly evolving digital landscape, artificial intelligence (AI) has started revolutionizing every facet of our lives, including warfare. AI-driven robots and drones, equipped with machine-learning algorithms, navigate complex terrains, conduct surveillance, and execute missions with unparalleled precision. This ability to process vast amounts of data in real time and make split-second decisions provides a critical advantage in the fast-paced environment of modern warfare, transforming the scope of military engagements and offering new possibilities for tactical offensives and national security defense. As a result, AI-driven military spending is projected to reach $38.8 billion by 2028. However, integrating AI into warfare also raises important ethical and legal questions. How is the global geopolitical landscape being reshaped by nations investing heavily in AI-driven military technologies? How reliable and trustworthy are AI-driven decisions in high-stakes military scenarios? How might AI change the nature of future military engagements and warfare tactics? Today, we're joined by Dr. Heather Roff, Senior Research Scientist at the Center for Naval Analyses; Dr. Herbert Lin, Senior Research Scholar at the Center for International Security and Cooperation; and Wendell Wallach, Co-director of the AI and Equality Initiative at the Carnegie Council for Ethics in International Affairs. Follow us at: Network2020.org | Twitter: @Network2020 | LinkedIn: Network 20/20 | Facebook: @network2020 | Instagram: @network_2020
CNA colleagues Kaia Haney and Heather Roff join Andy and Dave to discuss Responsible AI. They discuss the recent Inclusive National Security seminar on AI and National Security: Gender, Race, and Algorithms. The keynote speaker, Elizabeth Adams, spoke on the challenges that society faces in integrating AI technologies in an inclusive fashion, and she identified ways in which consumers of AI-enabled products can ask questions and engage on the topic of inclusivity and bias. The group also discusses a variety of topics around the many challenges that organizations face in operationalizing these ideas, including a revisit of the findings from recent medical research, which found an algorithm was able to identify the race of a subject from x-rays and CAT scans, even with identifying features removed. Inclusive National Security Series: AI and National Security: Gender, Race, and Algorithms | Inclusive National Security webpage | Sign up for the InclusiveNatSec mailing list here.
The world's superpowers are building new super-fast, super-lethal armies powered by artificial intelligence. AI warfare and its implications. Gen. John M. Murray, Patrick Turner and Heather Roff join Meghna Chakrabarti.
The robot future is female, but can it be feminist? Gendered digital assistants like Alexa are here to make our lives easier, yet all this artificial intelligence comes with retrograde human baggage (and unnecessary boobs). Caroline and Cristen investigate what robots reveal about the real world with two of the brightest human minds out there. Follow Unladylike on social @unladylikemedia. And sign up for our newsletter at unladylike.co/newsletter. Find Unladylike Merch at unladylike.co/shop. Call our hotline with your friendship stories at 262-8-GALPAL. Hear exclusive bonus episodes of Unladylike on Stitcher Premium! Use promo code "UNLADYLIKE" at stitcher.com/premium for a free month trial. This episode is brought to you by Beyond Meat [http://beyondmeat.com/unladylike], Felix Gray [http://felixgray.com/unladylike], and Hello Fresh [http://hellofresh.com/unladylike30 with code UNLADYLIKE30]. See omnystudio.com/listener for privacy information.
Defense and security experts discuss how Artificial Intelligence is changing the landscape of warfare and weapons. With China making a big push on A.I. research, how are competitors like the U.S. and India keeping up?
Stan Dempsey, head of the Colorado Mining Association, recently declared the so-called "War On Coal" to be over. Was there ever a war, or just market forces? One of the leading thinkers on artificial intelligence, Heather Roff, will speak this week at CU Boulder -- after we speak with her. And products that are made in Colorado, from beer cans to chemicals to satellites, could be caught up in the big international trade battles now underway.
AI Ethicist Heather Roff joins Heather Ross and Adam Doupé to talk about the future of artificial intelligence and autonomous weapons systems. Heather Roff is a Research Scientist at ASU's Global Security Initiative, and a Senior Research Fellow at Oxford, Cambridge, and New America. Future Out Loud's new website: https://futureoutloud.org . Show Notes • Follow Heather Roff on Twitter @Hmroff. • Global Security Initiative at ASU: https://globalsecurity.asu.edu • More about the WannaCry ransomware attack: http://www.npr.org/sections/thetwo-way/2017/05/15/528451534/wannacry-ransomware-what-we-know-monday • More about Software Patches and OS Updates: https://ist.mit.edu/security/patches • DeepMind Health: https://deepmind.com/applied/deepmind-health/
Dr Heather Roff discusses the role of autonomous weapons systems within the international community. She provides a theoretical framework for defining and classifying these systems, examining the diplomatic and moral concerns that they pose. For the past three years, the international community has convened a series of informal meetings of experts under the auspices of the United Nations Convention on Certain Conventional Weapons (CCW) to consider whether or not to preemptively ban lethal autonomous weapons systems under an additional protocol to the Convention. The debate has circled the same set of concerns: what exactly lethal autonomous weapons systems (AWS) are, and whether it is incumbent upon states to ban them before they are developed. Without a definition, states argue, they cannot know what exactly it is they are supposed to ban. Yet after three years of expert testimony, there is no agreement on any meaningful definition. Diplomatic considerations are pressing, but Dr Heather Roff believes that the source of this confusion is an antecedent and more profound concern, one that is inherently tied to the question of defining what constitutes an AWS. In plain terms, it is a concern with the authorization to make war and the subsequent delegation of this authority. Until now, humans have been the sole agents authorized to make and to wage war, and questions of authorization and war have never been technologically dependent. Rather, they have been moral considerations, not empirical ones. She attempts to provide a theoretical framework for defining and classifying autonomous weapons systems. By so doing, she argues that the moral quandary over autonomous weapons has its roots in concerns over the delegation of a (moral) faculty: the authority to wage war.
Drs. Heather Roff and Peter Asaro, two experts on autonomous weapons, talk about their work to understand and define the role of these systems, the problems they pose, and why the ethical issues surrounding autonomous weapons are so much more complicated than those raised by other AI systems.