In this episode of Human Rights Talks, we talk to Samuel Woolley, the Dietrich Endowed Chair in Disinformation Studies at the University of Pittsburgh. Samuel talks about the role of encrypted messaging apps in spreading mis- and disinformation, how this affects diaspora communities in particular and democracy more generally, and how some organizations are fighting back. Samuel Woolley is a writer and researcher specializing in the study of automation/artificial intelligence, emergent technology, politics, persuasion and social media. He is currently the Dietrich Endowed Chair in Disinformation Studies at the University of Pittsburgh. Previously, he founded the Propaganda Research Lab at the Center for Media Engagement at the University of Texas at Austin. He also founded and directed the Digital Intelligence Lab at the Institute for the Future, a 50-year-old think tank based in the heart of Silicon Valley, and co-founded and directed the research team at the Computational Propaganda Project at the Oxford Internet Institute, University of Oxford. He has written on the political manipulation of technology for a variety of publications including Wired, The Atlantic, VICE's Motherboard, TechCrunch, The Guardian, Quartz and Slate. His work has been presented to members of the North Atlantic Treaty Organization, the US Congress, the UK Parliament and numerous private entities and civil society organizations.
The Cognitive Crucible is a forum that presents different perspectives and emerging thought leadership related to the information environment. The opinions expressed by guests are their own, and do not necessarily reflect the views of or endorsement by the Information Professionals Association. During this episode, Sam Woolley of the University of Texas School of Journalism discusses journalism, propaganda, and ethics. Our conversation unpacks the definition of propaganda and how today's technology fuels propaganda and influence.
Research Question: Encrypted messaging apps (like WhatsApp, Signal, Discord, etc.) are becoming more popular, and disinformation campaigns are increasingly incubated in those spaces. How do disinformation and propaganda spread in encrypted spaces? How will we study propaganda in transport-layer encrypted spaces?
Resources:
Cognitive Crucible Podcast Episodes Mentioned:
#112 Jake Sotiriadis on the Value Proposition of Future Studies
#107 Vanessa Otero on News Ecosystem Health
#14 BDJ on Threatcasting
#116 Matt Jackson on Social Learning and Game Theory
Sam Woolley's Bio
Manufacturing Consent: The Political Economy of the Mass Media by Edward S. Herman and Noam Chomsky
Yellow Journalism
Bots by Nick Monaco and Samuel Woolley
Manufacturing Consensus: Understanding Propaganda in the Era of Automation and Anonymity by Sam Woolley
Center for Media Engagement at the University of Texas
Link to full show notes and resources: https://information-professionals.org/episode/cognitive-crucible-episode-117
Guest Bio: Samuel C. Woolley is an assistant professor in the School of Journalism and an assistant professor, by courtesy, in the School of Information, both at the University of Texas at Austin. He is also the project director for propaganda research at the Center for Media Engagement (CME) at UT. Woolley is currently a research associate at the Project for Democracy and the Internet at Stanford University. He has held past research affiliations at the Oxford Internet Institute, University of Oxford, and the Center for Information Technology Research in the Interest of Society (CITRIS) at the University of California, Berkeley. Woolley's research is focused on how emergent technologies are used in and around global political communication. His work on computational propaganda—the use of social media in attempts to manipulate public opinion—has revealed the ways in which a wide variety of political groups in the United States and abroad have leveraged tools such as bots and trending algorithms, along with tactics of disinformation and trolling, in efforts to control information flows online. His research on digital politics, automation/AI, social media, and political polarization is currently supported by grants from Omidyar Network (ON), the Miami Foundation, and the Knight Foundation. His past research has been funded by the Ford Foundation, the Hewlett Foundation, the Open Society Foundations, the New Venture Fund for Communications, and others. His latest book, The Reality Game: How the Next Wave of Technology Will Break the Truth, was released in January 2020 by PublicAffairs (US) and Octopus/Endeavour (UK). It explores the ways in which emergent technologies--from deep fakes to virtual reality--are already being leveraged to manipulate public opinion, and how they are likely to be used in the future. He proposes strategic responses to these threats with the ultimate goal of empowering activists and pushing technology builders to design for democracy and human rights.
He is currently working on two other books. Manufacturing Consensus (Yale University Press) explores the ways in which social media, and automated tools such as bots, have become global mechanisms for creating illusions of political support or popularity. He discusses the power of these tools for amplification and suppression of particular modes of digital communication, building on Herman and Chomsky's (1988) foundational work on propaganda. His other book, co-authored with Nicholas Monaco, is titled Bots (Polity) and is a primer on the ways these automated tools have become integral to the flow of all manner of information online. Woolley is the co-editor, with Philip N. Howard (Oxford), of Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media, released in 2018 in the Oxford Studies in Digital Politics series at Oxford University Press. This volume of country-specific case studies explores the rise of social media--and tools like algorithms and automation--as mechanisms for political manipulation around the world. He has published several peer-reviewed articles, book chapters, and white papers on emergent technology, the Internet and public life in publications such as the Journal of Information Technology and Politics, the International Journal of Communication, A Networked Self: Platforms, Stories, Connections, The Political Economy of Robots: Prospects for Prosperity and Peace in an Automated 21st Century, The Handbook of Media, Conflict and Security, and Can Public Diplomacy Survive the Internet? Bots, Echo Chambers and Disinformation. Woolley is the founding director of the Digital Intelligence Lab, a research- and policy-oriented project at the Institute for the Future—a 50-year-old think tank located in Palo Alto, CA. Before this, he served as the director of research at the National Science Foundation- and European Research Council-supported Computational Propaganda Project at the Oxford Internet Institute, University of Oxford. He is a former resident fellow at the German Marshall Fund's Digital Innovation and Democracy Initiative and a former Belfer Fellow at the Anti-Defamation League's Center for Technology and Society. He is a former research fellow at Jigsaw, Google's think tank and technology incubator; at the Tech Policy Lab at the University of Washington's Schools of Law and Information; and at the Center for Media, Data and Society at Central European University. His public work on computational propaganda and social media bots has appeared in venues including Wired, The Guardian, TechCrunch, Motherboard, Slate, and The Atlantic. For his research, Woolley has been featured in publications such as the New York Times, the Washington Post, and the Guardian, and on PBS's Frontline, BBC's News at Ten, and ABC's Today. His work on computational propaganda and bots has been presented to members of the U.S. Congress, the U.K. Parliament, NATO, and others. His Ph.D. is in Communication from the University of Washington. His website is samwoolley.org and he tweets from @samuelwoolley. About: The Information Professionals Association (IPA) is a non-profit organization dedicated to exploring the role of information activities, such as influence and cognitive security, within the national security sector and helping to bridge the divide between operations and research. Its goal is to increase interdisciplinary collaboration among scholars, practitioners and policymakers with an interest in this domain.
For more information, please contact us at communications@information-professionals.org. Or, connect directly with The Cognitive Crucible podcast host, John Bicknell, on LinkedIn. Disclosure: As an Amazon Associate, 1) IPA earns from qualifying purchases, 2) IPA gets commissions for purchases made through links in this post.
After a wave of protests against vaccines and vaccine mandates in Europe, it is clear that, despite very different policies across the continent, the anti-vaccination movements opposing them are remarkably alike. These movements are tightly linked, even though some of their connections are hidden. They amplify one another, and they have elaborate ways to support their activities financially - from selling books on Amazon to offering legal services to anyone who feels they have been harmed by vaccination. We had discussed this with disinformation researcher Dr Aliaksandr Herasimenka, so we thought we would come back to this interview and highlight the insights that explain the mobilisation around vaccines these days. A researcher at the Computational Propaganda Project of the Oxford Internet Institute, Herasimenka is a co-author of a paper on misinformation distribution on Telegram. The interview was edited for brevity and clarity. You can read the Alliance for Securing Democracy's research on China's propaganda and search engines here. We ran such a search ourselves, and the clips at the beginning of the episode are from the results we got. Our reporting is supported by Journalismfund.eu, Media Lab Bayern and Alfred Toepfer Stiftung. Please subscribe to our newsletter, and to this show on Apple Podcasts, Audible, Google Podcasts, Spotify or another platform of your choice. Follow us on Facebook as @theinoculation, on Twitter as @TInoculation, and on Instagram as @the_inoculation
Aggressive political networks, associations of concerned parents and even state-affiliated media - many actors have been found to share misinformation about the COVID-19 vaccines currently used in the EU. "Oxford seems to have created a vaccine for monkeys," a host on Rossiya 1 channel said in September. Why are they doing this? And how can we measure the influence these messages have on internet users? To find out, Daiva and Eva talk to Dr Aliaksandr Herasimenka, a postdoctoral researcher at the Computational Propaganda Project at the Oxford Internet Institute. "The COVID-19 pandemic has already caused a huge political and social disruption across the world. We will see emergence of new types of political groups, political organisations that will be more disruptive than what we previously called populists. They will be potentially more damaging to democracy," he says. Herasimenka is a co-author of a paper on misinformation distribution on Telegram. You can read all his papers here. In this episode Eva mentions comparisons of pandemic-related restrictions to the Holocaust - you can read more about the phenomenon here, here and here. Our research is supported by Journalismfund.eu. Please subscribe to this show on Apple Podcasts, Audible, Google Podcasts, Spotify or another platform of your choice. Follow us on Facebook as @theinoculation, on Twitter as @TInoculation, and on Instagram as @the_inoculation
Thursday, 10 December 2020, 7 – 8:30pm
Behind the Headlines: Women and Resistance in Belarus and Poland
The recent anti-government protests in Belarus and Poland have attracted significant media attention across the world, with commentators highlighting not only the remarkable resilience of the protesters in the face of violence but also the predominance of women at the forefront of the protests themselves. In this Behind the Headlines discussion, our panel of experts will address the background to the demonstrations in each country before considering the key role played by the protestors -- and particularly the women involved -- in mobilising an effective and enduring opposition.
Speakers:
'What went wrong – how women ended up being the losers in Poland's 'democratic' transition?' - Dr Jacqueline Hayden (TCD). Dr Jacqueline Hayden is director of the Centre for European Studies at Trinity College Dublin and author of Poles Apart: Solidarity and the New Poland and The Collapse of Communist Power in Poland: Strategic Misperceptions and Unanticipated Outcomes. She will explore how, since the election of the first PiS government, there has been a sharp resurgence of Catholic church values in political life. In seeking to institutionalise an ethnic-Catholic vision of the nation, both church and state in Poland have launched an attack on modernity. Women and the LGBT+ community have been the victims of this assault.
Leading protests on and behind the scenes: lessons from Belarus - Dr Aliaksandr Herasimenka (Oxford). Dr Herasimenka is a postdoctoral researcher at the Computational Propaganda Project at the Oxford Internet Institute. His work investigates how political groups and governments use social media to manipulate public opinion. He also studies how people organise protest movements in authoritarian countries. In his talk, Dr Herasimenka will offer perspectives on the role of women in the post-election 2020 protests in Belarus, including their important role as symbolic leaders of the movement.
SIGNS of the (feminist) revolution in Poland - Dr Aneta Stępień (Maynooth University). Dr Aneta Stępień is a university tutor on the Critical Skills Programme at Maynooth University. She has taught Polish language and literature, the cultures of Central and Eastern Europe, and gender, and her recent article "Women's Organizations and Antisemitism: The First Parliamentary Elections in Independent Poland" (2020) appeared in Nationalities Papers. Her talk offers a feminist look at #StrajkKobiet, the Women's Strike in Poland, a movement that emerged in September 2016 as a wave of demonstrations against the governmental proposal to restrict the abortion law.
The protests in Belarus and Poland: a comparative overview - Balázs Apor (TCD). Dr Apor is associate professor in European Studies at Trinity College Dublin and a historian of Central and Eastern Europe in the 20th century, with a special focus on the Communist period. His research interests include the study of propaganda and symbolic politics under Communist rule, and the Sovietisation of Eastern Europe after the Second World War. He will discuss the protests in Belarus and Poland from a comparative angle, highlighting common patterns as well as differences between the two movements and reflecting on their remarkable endurance.
About Behind the Headlines The Trinity Long Room Hub's ‘Behind the Headlines' discussion series draws on the expertise of distinguished panel contributors to explore contemporary issues in the broad contexts of Arts and Humanities research. Introduced in 2015, the series provides a forum for public understanding and creates a valuable space for informed and respectful public discourse. Find out more about Behind the Headlines series here https://www.tcd.ie/trinitylongroomhub/whats-on/details/behind-the-headlines.php The Trinity Long Room Hub Behind the Headlines series is supported by the John Pollard Foundation.
Lisa-Maria Neudert of the Computational Propaganda Project was our guest for today's show. The Computational Propaganda Project is a research group that has spent the last year looking at the way bots on social media and "junk news" (aka fake news) have affected and attempted to influence elections in Germany, France, the US, and the UK. Sadly she only had half an hour to talk, but the short conversation that we had was fascinating. We got talking about the influence bots can have on social media, where bots first started to be used as propaganda, and how governments and tech companies can start to fight back against this wave of digital propaganda.
Sponsor:
https://www.change.org/p/richard-pengelly-fight-the-13-5-million-cuts-in-the-northern-trust
https://www.unison.org.uk/
Resources:
Computational Propaganda Project - http://comprop.oii.ox.ac.uk/
http://www.thejist.co.uk/podcast/chatter-episode-14-bret-weinstein-evolutionary-implications-technology-modern-society/
http://www.thejist.co.uk/science-and-tech/russian-bots-us-election-trevors-axiom/
Follow us on Facebook or Twitter or sign up for our mailing list to get information on my upcoming book, Brexit: The Establishment Civil War.
Music from Just Jim - https://soundcloud.com/justjim
Samantha Bradshaw is a researcher at the Computational Propaganda Project and a doctoral candidate at the Oxford Internet Institute. She's been tracking the phenomenon of political manipulation through social media. You can find Samantha on Twitter at @sbradshaww. The YC podcast is hosted by Craig Cannon.
Topics:
53 - What is a bot?
2:53 - When computational propaganda began
3:53 - Changes in bot tactics since 2016
5:53 - Using bots for content creation
7:28 - WhatsApp and the upcoming Indian election
9:23 - Trends in computational propaganda
10:53 - How bots integrate into platforms
13:23 - Responsibilities of platforms to remove fake accounts
14:53 - The role of governments in media manipulation
18:18 - Fake news and selecting news that aligns with your beliefs
19:53 - Are platforms getting better or worse?
21:33 - Samantha's personal internet habits
23:03 - Sentiment around tracking in the UK vs the US
24:23 - The Mueller report and US midterms
29:18 - Canadian elections
30:18 - 2020 US elections
30:53 - Deepfakes
31:48 - Optimistic thoughts for the future
33:08 - How to help against computational propaganda
Around the world, automated bot accounts have enabled some government agencies and political parties to exploit online platforms: dispersing messages at scale, using keywords to game algorithms, and discrediting legitimate information. Through this they can spread junk news and disinformation; exercise censorship and control; and undermine trust in the media, public institutions and science. But is this form of propaganda really new? If so, what effect is it having on society? And is the worst yet to come as AI develops? Join our host, philosopher Peter Millican, as he explores this topic with Rasmus Nielsen, Director of Oxford's Reuters Institute for the Study of Journalism; Vidya Narayanan, post-doctoral researcher in Oxford's Computational Propaganda Project; and Mimie Liotsiou, also a post-doctoral researcher on the Computational Propaganda Project, who works on online social influence.
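As a concrete illustration of the kind of automated amplification researchers look for, here is a minimal, hypothetical sketch of a bot-detection heuristic: it flags accounts that post near-duplicate text or post at implausibly regular intervals. The account names, posting data, thresholds, and function names are invented for illustration; they are not drawn from this episode or from any real detection system, which would use far richer signals.

```python
from collections import Counter
from statistics import pstdev

# Hypothetical posting records: (account, text, unix_timestamp).
POSTS = [
    ("acct_a", "Vote YES on measure 12 #measure12", 1_600_000_000 + 60 * i)
    for i in range(50)
] + [
    ("acct_b", f"Enjoying the weather today, day {i}", 1_600_000_000 + 5_000 * i * (i % 7 + 1))
    for i in range(8)
]

def looks_automated(posts, min_posts=20, dup_ratio=0.5, max_interval_stdev=5.0):
    """Flag an account if most of its posts repeat the same text, or if its
    posting intervals are suspiciously regular (very low variance)."""
    if len(posts) < min_posts:
        return False
    texts = [text for _, text, _ in posts]
    most_common_count = Counter(texts).most_common(1)[0][1]
    if most_common_count / len(texts) >= dup_ratio:
        return True  # near-duplicate spam pattern
    times = sorted(ts for _, _, ts in posts)
    intervals = [b - a for a, b in zip(times, times[1:])]
    return pstdev(intervals) <= max_interval_stdev  # metronomic posting

# Group posts by account and report which ones trip the heuristic.
by_account = {}
for account, text, ts in POSTS:
    by_account.setdefault(account, []).append((account, text, ts))

for account, posts in by_account.items():
    print(account, "flagged" if looks_automated(posts) else "ok")
```

Real systems combine many such weak signals (timing, text reuse, account age, network structure), but even this toy version conveys why coordinated, automated accounts leave statistical fingerprints that organic users rarely do.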
Your vote counts. But will your vote be counted? Alia and Bob team up again for a very special election episode to get to the bottom of Alia's democratic anxiety: Are voting machines even the easiest way to hack an election? At what stage is your vote most vulnerable? And is democracy doomed in the digital age?!?! Hear from a whole slew of experts – hackers, cyber-security specialists, the team at DefCon's "Voting Village", and more – as we break out the full lifecycle of your vote and every hackable step along the way.
We'll cover:
DefCon presenting their Voting Village findings in DC.
Hacking into voting machines 15 years ago with Harri Hursti (Black Box Voting hacker, originator of "The Hursti Hacks").
The vulnerability of voting systems and consequences of HAVA (Help America Vote Act) with tech journalist Kim Zetter ("The Crisis of Election Security", New York Times).
Disinformation and trolling campaigns with researcher Nick Monaco (Oxford Internet Institute, The Computational Propaganda Project; Google's Jigsaw).
The diversity of election systems with election expert Maggie MacAlpine (Nordic Innovation Labs).
The impossibility of securing voting software with cryptography and system security researcher Matt Blaze (University of Pennsylvania).
Vulnerability of voter registration and long lines with Jake Braun (Cambridge Global, University of Chicago's Cyber Policy Initiative, Former Deputy National Field Director for President Obama, Organizer of DefCon's Voting Village).
Hacking demonstrations on electronic DRE voting machines with J. Alex Halderman (Michigan Center for Computer Security and Society).
Transmitting votes and enlisting white-hat hackers with Mark Kuhr (crowd-sourcing cybersecurity company Synack).
Auditing your local Secretary of State's election security with Adam Levin (CyberScout).
Breach is sponsored by Carbonite, how businesses protect their data. www.carbonite.com
Vidya Narayanan (Director of Research, Computational Propaganda Project, Oxford Internet Institute) delivers a lecture for The Business and Practice of Journalism Seminar Series.
Sam Woolley recently joined the Institute for the Future as a Research Director; he was previously the Director of Research at the Computational Propaganda Project at Oxford University. We asked Sam to share highlights of his research showing how political botnets—what he calls computational propaganda—are being used to influence public opinion.
Political manipulation and fake news have shaken trust in social media sites like Facebook and Twitter. Facebook's Mark Zuckerberg has vowed to make 2018 the year of big changes on the social media giant. And politicians around the world are threatening to bring in new regulations too. In Germany a new law is now forcing platforms to remove hate speech or face big fines. Join Ed Butler and guests for a discussion on who is to blame for the ills of social media - and how to fix them.
Contributors:
Samantha Bradshaw from the Computational Propaganda Project at Oxford University.
Douglas Rushkoff, Professor of Media Theory and Digital Economics at City University of New York and author of Throwing Rocks at the Google Bus.
Andreas Kluth, Editor-in-Chief of Handelsblatt Global, the online English-language edition of the German newspaper.
Roger McNamee is an American businessman, investor and venture capitalist who was an early mentor to Mark Zuckerberg.
Image: Mark Zuckerberg speaks on stage during the annual Facebook F8 developers conference in San Jose, California, U.S., April 18, 2017 (Credit: Reuters)
This episode is all about bots on social media with guest Samuel Woolley, Director of Research of the Computational Propaganda Project at the Oxford Internet Institute at the University of Oxford. We discuss exactly how users make bots, and the ways they are deployed on Facebook and Twitter to influence politics through, for example, spreading fake news or disrupting protests. Sam explains how bots are difficult to trace, since they are often geotagged in misleading locations or used for digital marketing. We also talk about bots in the 2016 US Presidential campaign between Donald Trump and Hillary Clinton, as well as look forward a bit into how bots might evolve in the future. You can follow Sam on Twitter @Samuelwoolley, and check out the Computational Propaganda Project at www.politicalbots.org. Don't forget to subscribe on iTunes to the Social Media and Politics Podcast at: https://t.co/7Sdk88P86U Tweet us on Twitter: @SMandPPodcast Like us on Facebook: www.facebook.com/socialmediaandpoliticspodcast
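To make the "how users make bots" discussion concrete: at its simplest, a social media bot is just a script that posts pre-written content on a schedule through a platform's API. The sketch below is a minimal, hypothetical illustration and is not the method described by the guest; the post_to_platform function is a stand-in that only prints (a real bot would call an authenticated platform API), and the messages and timing are invented.

```python
import itertools
import random
import time

# Pre-written talking points the bot cycles through (invented examples).
MESSAGES = [
    "Candidate X is surging in the polls! #ElectionDay",
    "Don't believe the mainstream coverage of Candidate X.",
    "Candidate X just gave an incredible speech. Watch it now!",
]

def post_to_platform(text: str) -> None:
    """Stand-in for a real platform API call (e.g. an authenticated
    'create post' request). Here it only prints, so the sketch is runnable."""
    print(f"[posted] {text}")

def run_bot(cycles: int = 3, base_delay_seconds: float = 1.0) -> None:
    """Post each message in turn, with slightly randomised delays so the
    activity looks less mechanical -- a trick often attributed to real bots."""
    for text in itertools.islice(itertools.cycle(MESSAGES), cycles * len(MESSAGES)):
        post_to_platform(text)
        time.sleep(base_delay_seconds * random.uniform(0.5, 1.5))

if __name__ == "__main__":
    run_bot()
```

Detection work of the kind discussed in the episode focuses on the traces such scripts leave behind: repeated text, bursty or metronomic timing, and coordinated amplification across many accounts.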