Podcasts about Citizen Lab

  • 197 PODCASTS
  • 266 EPISODES
  • 39m AVG DURATION
  • 1 MONTHLY NEW EPISODE
  • Oct 27, 2025 LATEST
Citizen Lab



Best podcasts about Citizen Lab

Latest podcast episodes about Citizen Lab

The Iran Podcast
Prison Break: Israeli Disinfo Operations

The Iran Podcast

Oct 27, 2025 · 34:42


Negar Mortazavi speaks to Alberto Fittarelli about Israel-linked influence operations that were pushing for regime change in Iran during the 12-day war. Alberto Fittarelli is a Senior Researcher at Citizen Lab at the University of Toronto.

CAISzeit – In welcher digitalen Gesellschaft wollen wir leben?
Green Innovations (Un)wanted? The Pop-Up Citizen Lab as a New Form of Citizen Participation

CAISzeit – In welcher digitalen Gesellschaft wollen wir leben?

Oct 2, 2025 · 34:15


How do citizens actually feel about environmentally and climate-friendly innovations? In the CAIS "Pop-Up Citizen Lab," a research team in Bochum is investigating how people react to green innovations, not through classic surveys, but through innovative citizen labs that take place directly on site and in which novel forms of participation are developed. The focus is on the heating transition and the question: how can citizens be better involved in decisions that directly affect their everyday lives and their neighbourhoods? Pauline Heger talks with host Matthias Begenat about the challenges of citizen participation, about creative research methods, and about why innovation is not just a matter of technology, but also of social acceptance and embedding.

Uncommons with Nate Erskine-Smith
The Strong Borders Act? with Kate Robertson and Adam Sadinsky

Uncommons with Nate Erskine-Smith

Oct 1, 2025 · 52:41


** There are fewer than 10 tickets remaining for the live recording of Uncommons with Catherine McKenna on Thursday, Oct 2nd. Register for free here. **

On this two-part episode of Uncommons, Nate digs into Bill C-2 and potential impacts on privacy, data surveillance and sharing with US authorities, and asylum claims and refugee protections.

In the first half, Nate is joined by Kate Robertson, senior researcher at the University of Toronto's Citizen Lab. Kate's career has spanned criminal prosecutions, regulatory investigations, and international human rights work with the United Nations in Cambodia. She has advocated at every level of court in Canada, clerked at the Supreme Court, and has provided pro bono services through organizations like Human Rights Watch Canada. Her current research at Citizen Lab examines the intersection of technology, privacy, and the law.

In part two, Nate is joined by Adam Sadinsky, a Toronto-based immigration and refugee lawyer and co-chair of the Canadian Association of Refugee Lawyers' Advocacy Committee. Adam has represented clients at every level of court in Canada, including the Supreme Court, and was co-counsel in M.A.A. v. D.E.M.E. (2020 ONCA 486) and Canadian Council for Refugees v. Canada (2023 SCC 17).

Further Reading: Unspoken Implications: A Preliminary Analysis of Bill C-2 and Canada's Potential Data-Sharing Obligations Towards the United States and Other Countries - Kate Robertson, Citizen Lab

Kate Robertson Chapters:
00:00 Introduction & Citizen Lab
03:00 Bill C-2 and the Strong Borders Act
08:00 Data Sharing and Human Rights Concerns
15:00 The Cloud Act & International Agreements
22:00 Real-World Examples & Privacy Risks
28:00 Parliamentary Process & Fixing the Bill

Adam Sadinsky Chapters:
33:33 Concerns Over Asylum Eligibility in Canada
36:30 Government Goals and Fairness for Refugee Claimants
39:00 Changing Country Conditions and New Risks
41:30 The Niagara Falls Example & Other Unfair Exclusions
44:00 Frivolous vs. Legitimate Claims in the Refugee System
47:00 Clearing the Backlog with Fair Pathways
50:00 Broad Powers Granted to the Government
52:00 Privacy Concerns and Closing Reflections

Part 1: Kate Robertson

Nate Erskine-Smith 00:00-00:01
Kate, thanks for joining me.

Kate Robertson 00:01-00:01
Thanks for having me.

Nate Erskine-Smith 00:02-00:15
So I have had Ron Deibert on the podcast before. So for people who really want to go back into the archive, they can learn a little bit about what the Citizen Lab is. But for those who are not that interested, you're a senior researcher there. What is the Citizen Lab?

Kate Robertson 00:16-01:00
Well, it's an interdisciplinary research lab based at the University of Toronto. It brings together researchers from a technology standpoint, political science, lawyers like myself, and other disciplines to examine the intersection between information and communication technologies, law, human rights, and global security. And over time, it's published human rights reports about some of the controversial and emerging surveillance technologies of our time, including spyware or AI-driven technologies. And it's also really attempted to produce thoughtful research that helps policymakers navigate some of these challenges and threats.

Nate Erskine-Smith 01:01-02:50
That's a very good lead into this conversation, because here we have Bill C-2 coming before Parliament for debate this fall, introduced at the beginning of June. And it's called the Strong Borders Act in short, but it touches, I started counting, 15 different acts that are touched by this omnibus legislation.
The government has laid out a rationale around strengthening our borders, keeping our borders secure, combating transnational organized crime, stopping the flow of illegal fentanyl, cracking down on money laundering, a litany of things that I think most people would look at and say they're broadly supportive of stopping these things from happening and making sure we're enhancing our security and the integrity of our immigration system and so on. You, though, have provided some pretty thoughtful and detailed, rational legal advice around some of the challenges you see in the bill. You're not the only one. There are other challenges on the asylum changes we're making. There are other challenges on lawful access and privacy. You've, though, highlighted, in keeping with the work of the Citizen Lab, the cross-border data sharing, the challenges with those data sharing provisions in the bill. It is a bit of a deep dive and a little wonky, but you've written a preliminary analysis of C-2 and Canada's potential data sharing obligations towards the U.S. and other countries, Unspoken Implications, and you published it mid-June. It is incredibly relevant given the conversation we're having this fall. So if you were to, at a high level, and we'll get into some of the weeds, but at a high level, articulate the main challenges you see in the legislation from the standpoint that you wrote in Unspoken Implications. Walk us through them.

Kate Robertson 02:51-06:15
Well, before C-2 was tabled, for a number of years now, myself and other colleagues at the lab have been studying new and evolving ways that we're seeing law enforcement data sharing and cross-border cooperation mechanisms being put to use. We have seen within this realm some controversial data sharing frameworks under treaty protocols or bilateral agreement mechanisms with the United States and others, which reshape how information is shared with law enforcement in foreign jurisdictions and what kinds of safeguards and mechanisms are applied to that framework to protect human rights. And I think, as a really broad trend, the simplest way to put it is that what we're really seeing is a growing number of ways that borders are actually being exploited to the detriment of human rights standards. Rights are essentially falling through the cracks. This can happen through cross-border joint investigations between agencies in multiple states, in ways that essentially go forum shopping for the most lax laws. You can also see foreign states that seek to leverage cooperation tools in democratic states in order to track, surveil, or potentially even extradite human rights activists and dissidents, journalists that are living in exile outside their borders. And what this has really come out of is a discussion point that has been made around the world: that if crime is going to become more transient across borders, then law enforcement also needs greater freedom to move more seamlessly across borders. But what often is left out of that framing is that the human rights standards that are deeply entrenched in our domestic law systems would also need to be concurrently meaningful across borders. And unfortunately, that's not what we're seeing. Canada is going to be facing decisions around this, both within the context of C-2 and around it, in the coming months and beyond, as we know that it has been considering and in negotiation around a couple of very controversial agreements. One of those, the sort of elephant in the room, so to speak, is that the legislation has been tabled at a time where we know that Canada and the United States have been in negotiations for actually a couple of years around a potential agreement under the CLOUD Act, which would quite literally cede Canada's sovereignty to United States law enforcement authorities and give them really a blanket opportunity to directly apply surveillance orders onto entities, both public and private, in Canada.

Nate Erskine-Smith 06:16-07:46
Well, so years-in-the-making negotiations, but we are in a very different world with the United States today than we were two years ago. I was just in Mexico City for a conference with parliamentarians across the Americas, and there were six Democratic congressmen and women there. One, Chuy Garcia, represents a Chicago district. He was telling me that he went up to ICE officials and they're masked, and he is saying, identify yourself. And he's a congressman. He's saying, identify yourself. What's your ID? What's your badge number? They're hiding their ID and maintaining masks and they're refusing to identify who they are as law enforcement officials, refusing to identify who they are to an American congressman. And if they're willing to refuse to identify themselves in that manner to a congressman, I can only imagine what is happening to people who don't have that kind of authority and standing in American life. And that's the context that I see this in now. I would have probably still been troubled to a degree with open data sharing and laxer standards on the human rights side, but it's all the more troubling. You talk about less democratic jurisdictions and authoritarian regimes. Well, isn't the U.S. itself a challenge today more than it ever has been? And then shouldn't we maybe slam the pause button on negotiations like this?

Kate Robertson 07:47-09:54
Well, you raise a number of really important points. And I think that there have been warning signs, and worse, that long preceded the current administration and the backsliding that you're commenting upon since the beginning of 2025. Certainly, I spoke about the increasing trend of the exploitation of borders. I mean, I think we're seeing signs that borders are actually, in essence, being used as a form of punishment in some respects, which I would say it is when you say to someone who would potentially exercise due process rights against deportation that if you exercise those rights, you'll be deported to a different continent from your home country, where your rights are perhaps less protected. And that's something that UN human rights authorities have been raising alarm bells about around the deportation of persons to third countries, potentially where they'll face risks of torture even. But these patterns are all too reminiscent of what we saw in the wake of 9/11 and the creation of black sites, where individuals, including Canadian persons, were detained or even tortured. And really, this stems from a number of issues. But what we have identified in analyzing a potential CLOUD Act agreement is really just the momentous decision that the Canadian government would have to make to concede sovereignty to a country which is in many ways a pariah for refusing to acknowledge extraterritorial international human rights obligations to persons outside of its borders. And so to invite that type of direct surveillance and exercise of authority within Canada's borders, by a country that has refused for a very long time, unlike Canada and many other countries around the world, to recognize through its courts and through its government any obligation to protect the international human rights of people in Canada.

Nate Erskine-Smith 09:56-10:21
And yet, you wrote, some of the data and surveillance powers in Bill C-2 read like they could have been drafted by U.S. officials. So you take the frame that you're just articulating around what the U.S. worldview is on this and has been, exacerbated by obviously the current administration. But I don't love the sound of it reading like it was drafted by American officials.

Kate Robertson 10:22-12:43
Well, you know, it's always struck me as a really remarkable story, to be frank. You know, to borrow from Dickens, a tale of two countries, which is that since the 1990s, Canada's Supreme Court has been charting a fundamentally different course from the constitutional approach that's taken in the United States around privacy and surveillance. And it really started with persons looking at what's happening and the way that technology evolves, and how much insecurity people feel when they believe that surveillance is happening without any judicial oversight, and looking ahead and saying, you know what, if we take this approach, it's not going to go anywhere good. And that's a really remarkable decision that was made and has continued to be made by the court time and time again. Even as recently as last year, the court has said we take a distinct approach from the United States. And it had a lot of foresight given, you know, in the 1990s, technology was nowhere near what it is today. Of course. And yet in the text of C-2, we see provisions that, you know, I struggle when I hear proponents of the legislation describe it as balanced and in keeping with the Charter, when actually they're proposing to essentially flip the table on principles that have been enshrined for decades to protect Canadians, including, for example, the notion that third parties like private companies have the authority to voluntarily share our own information with the police without any warrant. And that's actually the crux of what has become a fundamentally different approach that I think has really led Canada to be a more resilient country when it comes to technological change.
And I sometimes describe us as a country that is showing the world that, you know, it's possible to do both. You can judicially supervise investigations that are effective and protect the public, and the sky does not fall if you do so. And right now we're literally seeing in C-2 something that I think is a really unique and important made-in-Canada approach being potentially put on the chopping block.

Nate Erskine-Smith 12:44-13:29
And for those listening who might think, okay, well, at a high level, I don't love expansive data sharing and reduced human rights protections, but practically, are there examples? And you pointed in your writing, right from the hop, to the Arar case, and you mentioned the Supreme Court, but they, you know, they noted that it's a chilling example of the dangers of unconditional information sharing. And the commission noted too the potentially risky exercise of open-ended, unconditional data sharing as well. But that's a real-life example, a real-life Canadian example, of what can go wrong in a really horrible, tragic way when you don't have guardrails that focus on and protect human rights.

Kate Robertson 13:31-14:56
You're right to raise that example. It's a really important one. It's one that is, I think, part of, you know, Canada has many commendable and important features to its framework, but it's not a perfect country by any means. That was an example of just information sharing with the United States itself that led to a Canadian citizen being rendered and tortured in a foreign country. As an even more recent example, we are not the only country that's received requests for cooperation from a foreign state in circumstances where a person's life is quite literally in jeopardy. We have known from public reporting that in the case of Hardeep Nijjar, before he was ultimately assassinated on Canadian soil, an Interpol Red Notice had been issued about him at the request of the government of India. And the government had also requested his extradition. And we know that there's a number of important circumstances that have been commented upon by the federal government in the wake of those revelations. And it's provoked a really important discussion around the risks of foreign interference. But it is certainly an example where we know that cooperation requests have been made in respect of someone who was quite literally and tragically at risk of loss of life.

Nate Erskine-Smith 14:57-16:07
And when it comes to, what we're really talking about is, you mentioned the CLOUD Act. There's also, I've got to go to the notes because it's so arcane, the Second Additional Protocol to the Budapest Convention. In that case, it's a treaty that Canada would ratify, and then this piece of legislation would in some way create implementing authorities for it. I didn't fully appreciate this until going through that. And I'd be interested in your thoughts just in terms of the details of these, and we can make it as wonky as you like in terms of the challenges that these treaties offer. I think you've already articulated the watering down of traditional human rights protections and privacy protections we would understand in Canadian law. But the transparency piece, I didn't fully appreciate either. And as a parliamentarian, I probably should have, because until reading your paper, I didn't know that there was a Policy on Tabling of Treaties that really directs a process for introducing treaty implementing legislation. And this process also gets that entirely backwards.

Kate Robertson 16:09-17:01
That's right. And, you know, in researching and studying what to do with, you know, what I foresee is potentially quite a mess if we were to enter into a treaty that binds us to standards that are unconstitutional. You know, that is a diplomatic nightmare of sorts, but it's also one that would create, you know, a constitutional entanglement that's really, I think, unprecedented in Canada. But nevertheless, that problem is foreseen if one or both of these were to go ahead, and I refer to that in the CLOUD agreement or the 2AP. But this policy, as I understand it, I believe it was tabled by then Foreign Affairs Minister Maxime Bernier, as he was at the time, under Prime Minister Harper's government.

Nate Erskine-Smith 17:02-17:04
He's come a long way.

Kate Robertson 17:07-18:12
I believe that the rationale for the policy was quite self-evident at the time. I mean, if you think about the discussions that are happening right now, for example, in Quebec around digital sovereignty and the types of entanglements that U.S. legal process might impact around Quebec privacy legislation. Other issues around the AI space in Ontario, or our health sector in terms of technology companies in Ontario. These treaties really have profound implications at a much broader scale than the federal government and law enforcement. And that's not even getting to Indigenous sovereignty issues. And so the policy is really trying to give a greater voice to the range of perspectives that a federal government would consider before binding Canada internationally on behalf of all of these layers of decision making, without perhaps even consulting with Parliament first.

Nate Erskine-Smith 18:12-19:15
So this is, I guess, one struggle. There's the specific concerns around watering down protections, but just on process, this just bothered me in particular, because we're going to undergo this process in the fall. And so I printed out the Strong Borders Act, Government of Canada Strengthens Border Security, and the backgrounder to the law. And going through it, it's six pages when I print it out. And it doesn't make mention of the Budapest Convention. It doesn't make mention of the CLOUD Act. It makes mention of any number of rationales for this legislation, but it doesn't make mention that this is in part, at least, to help implement treaties that are under active negotiation. That not only gets the policy backwards, but one would have thought, especially as I took from your paper that the Justice Department has subsequently acknowledged that this would in fact help the government implement these treaties, so surely it should be in the background.

Kate Robertson 19:15-19:57
I would have thought so. As someone that has been studying these treaty frameworks very carefully, it was immediately apparent to me that they're at least relevant. It was put in the briefing as a question as to whether or not the actual intent of some of these new proposed powers is to put Canada in a position to ratify this treaty. And the answer at that time was yes, that that is the intent of them. And it was also stated that other cooperation frameworks were foreseeable.

Nate Erskine-Smith 19:59-20:57
What next? So here I am, one member of parliament, and oftentimes through these processes, there's the objective of the bill, and then there's the details of the bill, and we're going to get this bill to a committee process. I understand the intention is for it to be a pretty fulsome committee hearing, and it's an omnibus bill. So what should happen is the asylum components should get kicked to the immigration committee, the pieces around national security should obviously get kicked to the public safety committee, and there should be different committees that deal with the different constituent elements that are relevant to those committees. I don't know if it will work that way, but that would be a more rational way of engaging with a really broad-ranging bill. Is there a fix for this, though? So are there amendments that could cure it, or is it foundationally a problem that is incurable?

Kate Robertson 20:58-21:59
Well, I mean, I think that for myself, as someone studying this area, it's obvious to me that what agreements may be struck would profoundly alter the implications of pretty much every aspect of this legislation. And that stems in part from just how fundamental it would be if Canada were to cede its sovereignty to US law enforcement agencies, and potentially even national security agencies as well. But obviously, the provisions themselves are quite relevant to these frameworks. And so it's clear that Parliament needs to have the opportunity to study how these provisions would actually be used. And I am still left not knowing how that would be possible without transparency about what is at stake in terms of potential agreements.

Nate Erskine-Smith 22:00-22:05
Right. What have we agreed to? If this is implementing legislation, what are we implementing?

Kate Robertson 22:05-24:57
Certainly, it's a significantly different proposition now. Even parking the international data sharing context, there are the constitutional issues that are raised in the parts of the bill that I'm able to study within my realm of expertise, which is, in the context of omnibus legislation, not the entire bill, of course. But it's hard to even know where to begin. The powers that are being put forward, you know, I kind of have to set the table a bit to explain why the table is being flipped. We're at a time where, you know, a number of years ago, I published about the growing use of algorithms and AI and surveillance systems in Canada, and gaps in the law, and the need to bring Canada's oversight into the 21st century. Those gaps now, even five years later, are growing into chasms.
And we've also had multiple investigative reports by the Privacy Commissioner of Canada being sent to Parliament about difficulties it's had reviewing the activities of law enforcement agencies, and difficulties it's had with private sector companies who've been non-compliant with privacy legislation and with cooperating at all with the regulator. And we now have powers being put forward that would essentially say, for greater certainty, it's finders-keepers rules: anything in the public domain can be obtained and used by police without warrant. And while this has been put forward as a balancing of constitutional norms, the Supreme Court has said the opposite. It's not an all-or-nothing field. And in the context of commercial data brokers that are harvesting and selling our data, including mental health care that we might seek online, and AI-fueled surveillance tools that are otherwise unchecked in the Canadian domain, I think this is a frankly stunning response to the context of the threats that we face. And I really think it creates really problematic questions around what law enforcement and other government agencies are expected to do in the context of future privacy reviews, when essentially everything that's been happening is supposedly being green-lit with this new, completely un-nuanced power.

Nate Erskine-Smith 24:57-27:07
I should note you are certainly not alone in these concerns. I mean, in addition to the paper that I was talking about at the outset that you've written alongside Ron Deibert and the Citizen Lab, there's another open letter you've signed that's called for the withdrawal of C-2, led by OpenMedia. I mean, BCCLA, the British Columbia Civil Liberties Association, the Canadian Civil Liberties Association, the Canadian Council for Refugees, the International Civil Liberties Monitoring Group, PEN Canada, the Centre for Free Expression, privacy experts like Colin Bennett, who, back when I was on the Privacy Committee, were pretty regular witnesses. You mentioned the Privacy Commissioner has not signed the open letter, but the Privacy Commissioner of Canada and the Information Commissioner of Ontario, who's also responsible for privacy, in the context of the treaties that you were mentioning, the Budapest Convention in particular, had highlighted concerns absent updated, modernized legislation. And at the federal level, we have had, in fits and starts, attempts to modernize our private sector privacy legislation. But apart from a consultation paper at one point around the Privacy Act, which would apply to public sector organizations, there's really been no serious effort to table legislation or otherwise modernize that. So am I right to say, you know, we are creating a myriad of problems with respect to watering down privacy and human rights protections domestically, and especially in relation to foreign governments with relation to data of our citizens here. And we could potentially cure those problems, at least in part, if we modernized our privacy legislation and our privacy protections and human rights protections here at home. But we are, as you say, going from gap to chasm. We are so woefully behind in that conversation. It's a bit of an odd thing to pass the open-ended data sharing and surveillance piece before you even have a conversation around updating your privacy protections.

Kate Robertson 27:07-28:13
Yeah, I mean, frankly, odd? I would use the word irresponsible. We know that with these tools, it's becoming increasingly well documented how impactful they are for communities and individuals, whether it's wrongful arrests, whether it's discriminatory algorithms. Really fraught tools, to say the least. And it's not as if Parliament does not have a critical role here. You know, in decades past, to use the example of surveillance within Quebec, which was ultimately found to have involved, you know, years of illegal activity and surveillance activities focused on political organizing in Quebec, that led to Parliament striking an inquiry and ultimately overhauling the mandate of the RCMP. There were recommendations made that the RCMP needs to follow the law. That was an actual recommendation.

Nate Erskine-Smith 28:14-28:16
I'm sorry that it needs to be said, but yeah.

Kate Robertson 28:16-29:05
The safeguards around surveillance are about ensuring that when we use these powers, they're being used appropriately. And, you know, there isn't even, frankly, a guarantee that judicial oversight will enable this to happen. And it certainly provides comfort to many Canadians. But we know, for example, that there were phones being watched of journalists in Montreal with, unfortunately, judicial oversight, not even that many years ago. So this is something that certainly is capable of leading to more abuses in Canada around political speech and online activity. And it's something that we need to be protective against and forward thinking about.

Nate Erskine-Smith 29:05-29:58
Yeah, and the conversation has to hold at the same time considerations of public safety, of course, but also considerations for due process and privacy and human rights protections. These things, we have to do both. If we don't do both, then we're not the democratic society we hold ourselves out as. I said odd, you said irresponsible. You were forceful in your commentary, but the open letter that had a number of civil society organizations, I mentioned a few, was pretty clear to say the proposed legislation reflects little more than shameful appeasement of the dangerous rhetoric and false claims about our country emanating from the United States. It's a multi-pronged assault on the basic human rights and freedoms Canada holds dear. Got anything else to add?

Kate Robertson 30:00-30:56
I mean, the elephant in the room is the context in which the legislation has been tabled. And I do think that we're at a time where we are seeing democratic backsliding around the world, of course, and rising digital authoritarianism. And these standards really don't come out of thin air. They're ones that need to be protected. And I do find myself, when I look at some of the really un-nuanced powers that are being put forward, asking whether or not those risks are really front and centre when we're proposing to move forward in this way. And I can only defer to experts from, as you said, hundreds of organizations that have called attention towards pretty much every aspect of this legislation.

Nate Erskine-Smith 30:57-31:44
And I will have the benefit of engaging folks on the privacy side around lawful access, and around concerns around changes to the asylum claim and due process from the Canadian Association of Refugee Lawyers. But as we do see this move its way through Parliament, if we see it move its way through Parliament in the fall, recognizing that the call was for withdrawal, but also recognizing a political reality where, if it is to pass, we want to make sure we are improving it as much as possible. If there are amendments along the way, if there are other people you think that I should engage with, please do let me know, because this is before us. It's an important piece of legislation.
And if it's not to be withdrawn, we better improve it as much as possible.Kate Robertson31:46-32:02I appreciate that offer and really commend you for covering the issue carefully. And I really look forward to more engagement from yourself and other colleagues in parliament as legislation is considered further. I expect you will be a witness at committee,Nate Erskine-Smith32:02-32:06but thanks very much for the time. I really appreciate it. Thanks for having me.Part 2: Adam SadinskyChapters:33:33 Concerns Over Asylum Eligibility in Canada36:30 Government Goals and Fairness for Refugee Claimants39:00 Changing Country Conditions and New Risks41:30 The Niagara Falls Example & Other Unfair Exclusions44:00 Frivolous vs. Legitimate Claims in the Refugee System47:00 Clearing the Backlog with Fair Pathways50:00 Broad Powers Granted to the Government52:00 Privacy Concerns and Closing ReflectionsNate Erskine-Smith33:33-33:35Adam, thanks for joining me.Adam Sadinsky33:35-33:36Thanks for having me, Nate.Nate Erskine-Smith33:36-33:57We've had a brief discussion about this, by way of my role as an MP, but, for those who are listening in, they'll have just heard a rundown of all the concerns that the Citizen Lab has with data surveillance and data sharing with law enforcement around the world. You've got different concerns about C2 and you represent the Canadian Association of Refugee Lawyers. What are your concerns here?Adam Sadinsky33:57-35:31I mean, our biggest concern with this bill is new provisions that create additional categories of folks ineligible to claim asylum in Canada. And specifically to have their hearings heard at the Immigration and Refugee Board. The biggest one of those categories is definitely, a bar on individuals making refugee claims in Canada one year after they have arrived in Canada, and that's one year, whether they have been in Canada for that whole year or they left at some point and came back. 
Those folks who have been here, who came more than a year ago, if they now fear persecution and want to make a claim for refugee protection, this bill would shunt them into an inferior system where, rather than having a full hearing, their day in court, their application will be decided by an immigration officer alone, sitting in a cubicle, probably, with some papers in front of them. That person is going to make an enormous decision about whether to send that person back home, where they feared persecution, torture, death. Our position is that this new form of ineligibility is unfair. It doesn't meet the government's goals, as we understand them, and we share the views of organizations like Citizen Lab that the bill should be withdrawn. There are other ways to do this, but this bill is fundamentally flawed.

Nate Erskine-Smith 35:31-35:57
Let's talk about government goals. They're looking at the influx of temporary residents in Canada specifically, and I don't wanna pick on international students, but we've seen a huge influx of international students, just as one category example. And they've said, well, if someone's been here for a year and they didn't claim right away, they didn't come here to claim asylum, because they would've claimed within that first year, presumably. You know, what's the problem with a rule that is really trying to tackle this problem?

Adam Sadinsky 35:57-38:33
The issue is, I mean, Nate, you had mentioned people who had come to Canada, who didn't initially claim asylum, temporary residents. What do we do about it? I wanna give a couple of examples of people who would be caught by this provision, who fall into that category, but who have legitimate reasons why they might claim more than a year after arriving in Canada. The first is someone who came to Canada, student, worker, whatever.
At the time they came to Canada, they would've been safe going back home; they didn't have a fear of returning. But country conditions change, and they can change quickly. The Taliban takeover of Afghanistan in 2021 was a stark example: there may have been people who came to Canada as students planning to go back to Afghanistan and rebuild their country. As the bill is currently written, if there were to be a situation like that, and there will be some other Afghanistan, some other situation down the line, those people who weren't afraid when they originally came to Canada and now have a legitimate claim will have an inferior process to go through, one that is riddled with unfairness compared to the regular refugee system, and a lack of protection from deportation pending any appeal.

So that's one category. A second category is people who were afraid of going back home when they came to Canada but didn't need to claim asylum because they had another avenue to remain in Canada. The government advertised this. Minister Fraser was saying this often: come to Canada, come as a student, and there's a well-established pathway. You'll have a study permit, you'll get a post-graduation work permit. This is what the government wanted. The rug was pulled out from under many of those people towards the end of last year, when Canada said, okay, it's enough, too many temporary residents. But what about the temporary residents who had a fear of returning home when they came? They went through the system the "right way," quote unquote. They didn't go to the asylum system; they went through another path. And now they're looking at it, and they say, well, you know, I came to Canada to study, but also I'm gay, and I'm from a country where, if people know about that, I'll be tortured.
Maybe since they've been in Canada, that person in that example, they've been in a relationship, they've been posting on social media with their partner. It is very dangerous. So why shouldn't that person claim refugee protection through the regular means?

Nate Erskine-Smith 38:33-39:06
Is this right, on your read of the law as it is written right now: if someone were to come with their family when they're a kid, and they were to be in Canada for over a year, and then their family were to move back to either the home country or to a different country, and they wake up as a teenager many years later, they wake up as an adult many years later, and their country's falling apart, and they were to flee and come to Canada. By virtue of the fact they've been here for a year as a kid, would that preclude them from making a claim?

Adam Sadinsky 39:06-39:10
It's even worse than that, Nate.

Nate Erskine-Smith 39:09-39:10
Oh, great.

Adam Sadinsky 39:10-39:47
In your example, the family stayed in Canada for more than a year. Yes, absolutely, that person is caught by this provision. But here's who else would be: someone comes when they're five years old with their family, on a trip to the United States. During that trip, they decide, we want to see the Canadian side of Niagara Falls. They either have a visa, or get whatever visa they need, or don't need one. They visit the falls, and at the point that they enter Canada, a clock starts ticking. That never stops ticking. So maybe they came to Canada for two hours.

Nate Erskine-Smith 39:44-39:45
Two hours and you're outta luck.

Adam Sadinsky 39:45-39:47
They go back to the US.

Nate Erskine-Smith 39:47-39:47
Oh man.
That person is going to have a very different kind of process to go through, to get protection in Canada, than someone who wouldn't be caught by this bill.

Nate Erskine-Smith 40:09-40:34
Say those are the facts as they are; that's one category. There's another category where I've come as a student, I thought there would be a pathway, I don't really fear persecution in my home country, but I want to stay in Canada. We see it in this constituency office, as other constituency offices do: people come for immigration help and they've got legitimate claims. We see some people come for help with illegitimate claims.

Adam Sadinsky 40:34-42:46
We have to be very careful when we talk about categorizing claims as frivolous. There is no question people make refugee claims in Canada that have no merit. You'll not hear from me, you'll not hear from our organization, that 100% of refugee claims made in Canada have merit. The issue is how we determine that. At that initial stage, when you're saying, oh, let's deal quickly with frivolous claims: how do you determine if a claim is frivolous? You know, I do a lot of appeal work. We get appeals of claims prepared by immigration consultants, or not even immigration consultants, and there's a core of a very strong refugee claim there that wasn't prepared properly.

Nate Erskine-Smith 42:46-42:46
Yeah, we see it too. That's a good point.

Adam Sadinsky 42:46-42:46
How that claim was prepared has nothing to do with what the person actually faces back home. We have to be very careful in terms of quick negative decisions and clearing the decks of what some might think are frivolous claims. There may be some legitimate and very strong core there.
What could be done, and you alluded to this, is this: there are significant numbers of claims in the refugee board's backlog that are very, very strong, just based on the countries they come from or the profiles of the individuals who have made those claims. There are countries that have a 99% success rate. And that's not because the board is super generous; it's because the conditions in those countries are very, very bad. And so the government could implement policies, and this could be done without legislation, to grant pathways for folks from, for example, Eritrea, with a 99-ish percent success rate. However the government wants to deal with that in terms of numbers, there's no need for the board to spend time determining whether this claim is in the 1% that doesn't deserve to be accepted. Our view is that the 1% being accepted is a trade-off for a more efficient system.

Nate Erskine-Smith 42:46-43:30
Similarly though, individuals come into my office and they've been here for more than five years. They have been strong contributors to the community. They have jobs. They're oftentimes connected to a faith organization. They're certainly connected to a community-based organization that is going to bat for them. There's, you know, obviously no criminal record; in many cases they have other family here. And they've gone through so many appeals at different times. I look at that and I go, throughout Canadian history, there have been different regularization programs. Couldn't you clear a ton of people, not on a country-specific basis but on a category-specific basis: over five years here, economic contributions, community contributions, no criminal record, you're approved?

Adam Sadinsky 43:30-44:20
Yeah, I'd add to your list of categories folks who are working in professions that Canada needs workers in. Take the example of construction. We are facing a housing crisis, and so many construction workers are not Canadian.
Many of my clients who are refugee claimants waiting for their hearings are working in the construction industry. And the government did this back in the COVID pandemic, creating what became known as the Guardian Angels program, where folks who were working in the healthcare sector, on the front lines, combating the pandemic, supporting folks who needed it, were taken out of the refugee queue with a designated pathway to permanent residence on the basis of the work and the contribution they were doing. All of this could be done.

Adam Sadinsky 44:20-45:05
The refugee system is built on Canada's international obligations under the refugee convention. To claim refugee protection, to claim asylum, is a human right. Every person in the world has the right to claim asylum. Individuals who are claiming asylum in Canada are exercising that right. Each individual has their own claim, and that's the real value that the refugee board brings to bear, and why Canada has had a gold-standard refugee system, replicated around the world: every individual has their day in court, to explain to an expert tribunal why they face persecution. This bill would take that away.

Nate Erskine-Smith 45:05-46:18
Yeah, I can't put my finger on what the other rationale would be though, because why this change now? Well, we have right now a huge number, over a million people, who are going to eventually be without status because they're not gonna have a pathway that they originally thought would be there. The one frustration I have sometimes in the system is there are people who have come into my office with the original claim being unfounded. But then I look at it, and, partly because the process took so long, they've been here for over five years. If you've been here for over five years and you're contributing and you're a member of the community, and now we're gonna kick you out?
Like, your original claim might have been unfounded, but this is insane. Now you're contributing to this country, and what a broken system. So I guess I'm sympathetic to the need for speed at the front end, to ensure that unfounded claims are deemed unfounded and people are deported, and legitimate claims are deemed founded and people can be welcomed, so cases don't continue to come into my office that are over five or six years long, where I go, I don't even care if it was originally unfounded or not. Welcome to Canada. You've been contributing here for six years anyway.

Adam Sadinsky 46:18-46:33
But if I can interject? Even if the bill passes as written, each of these individuals is still going to have what's called a pre-removal risk assessment.

Nate Erskine-Smith 46:31-46:33
They're still gonna have a process. Yeah, exactly.

Adam Sadinsky 46:33-46:55
They're still gonna have a process, and they're still going to have wait times. All these people are still in the system. The bill is a bit of a shell game where folks are just being transferred from one process to another, and you say, oh, wow, great, look, we've reduced the backlog at the IRB by however many thousand claims.

Nate Erskine-Smith 46:53-46:55
And we've increased the backlog in the other process.

Adam Sadinsky 46:55-48:25
Oh, look at the wait time at IRCC. And I'm sure you have constituents who come into your office and say, I filed a spousal sponsorship application two and a half years ago, I'm waiting for my spouse to come, and it's taking so long. IRCC is not immune from processing delays. There doesn't seem to be, along with this bill, a corresponding hiring of hundreds and hundreds more pre-removal risk assessment officers. So this backlog and this number of claims is shifting from one place to another. And another point I mentioned earlier: within the refugee system, within the board, when a person appeals a negative decision, right? Because humans make decisions, and humans make mistakes.
And that's why we have legislated appeal processes in the system, to allow for mistakes to be corrected. That appeal process happens within the board, and a person is protected from deportation while they're appealing. With a pre-removal risk assessment, with this other system, it's different. The moment that an officer makes a negative decision on that assessment, that person is now eligible to be deported. CBSA can ask them to show up the next day and get on a plane and go home. Yes, a person can apply for judicial review in the Federal Court; that does not stop their deportation unless they can bring a motion to the court for a stay of removal.

Nate Erskine-Smith 48:19-48:25
You're gonna see a ton of new work for the Federal Court. You are gonna see double the work for the Federal Court.

Adam Sadinsky 48:25-48:39
Which is already overburdened. So unless the government is also appointing many, many new judges, and probably hiring more counsel at the Department of Justice, this backlog is going to move from one place to another.

Nate Erskine-Smith 48:39-48:41
It's just gonna be whack-a-mole with the backlog.

Adam Sadinsky 48:41-48:52
The only way to clear the backlog is to clear people out of it. There's no fair way to clear folks out of it in a negative way. So the only way to do that is positively.

Nate Erskine-Smith 48:52-49:37
In the limited time we've got left: the bill also empowers the Governor in Council, the cabinet, to cancel documents, to suspend documents. And just so I've got this clear in my mind, say, for example, one is a student on campus, or one is on a work permit, and one is involved in a protest, and that protest the government deems to be something they don't like. The government could cancel the student's permit on the basis that they were involved in the protest. Is that right, as the law is written? Not to say that this government would do that, but this would allow the government to legally do just that.
Am I reading it wrong?

Adam Sadinsky 49:37-50:46
The bill gives broad powers to the government to cancel documents. I think you're reading it correctly. To me, when I read the bill, I don't particularly understand exactly what is envisioned, where the government would do this, why a government would want to put this in. But you are right. I would hope this government would not do that, but this government is not going to be in power forever. When you put laws on the books, they can be used by whomever, for whatever reason they want, within how that law is drafted. You know, we saw down south, the Secretary of State a few months ago said, okay, we're gonna cancel the permits of everyone from South Sudan in the US because they're not taking back people being deported. It's hugely problematic. It's a complete overreach. It seems like there could be regulations that are brought in, but the power as written in this law is so broad that it could definitely be used for purposes most Canadians would not support.

Nate Erskine-Smith 50:46-51:07
And obviously that's a worst-case scenario, when we think about the United States in today's political climate. But it's not clear, to your point, what the powers are necessary for. If we are to provide additional powers, we should only provide powers as much as necessary and proportionate to the goal we want to achieve. Is there anything else you want to add?

Adam Sadinsky 51:07-51:43
I just wanna touch, and I'm sure you got into a lot of these issues, on the privacy side.
The privacy issues in this bill bleed over into the refugee system, with broad search powers, particularly requiring service providers to provide information. We are concerned these powers could be used by CBSA, for example, to ask a women's shelter to hand over information about a woman claiming refugee protection, or who's undocumented, living in a shelter. We have huge concerns that these powers will not just be used by police, but also by Canada Border Services and immigration enforcement. I'm not the expert on privacy issues, but we see the specter of those issues as well.

Nate Erskine-Smith 51:43-52:22
That's all the time we've got. But in terms of what would help inform my own advocacy going forward: this bill is gonna get to committee. I'm gonna support the bill in committee and see if we can amend it. I know the position of CARL is to withdraw it. The position of a number of civil society organizations is to withdraw it. I think it's constructive to have your voice and others at committee, and to make the same arguments you made today with me. I know your argument's gonna be to withdraw, and you'll say then, in the alternative, here are changes that should be made. When you've got a list of those changes in detailed legislative amendment form, flip them to me and I'll share the ideas around the ministry and with colleagues. I appreciate the time. Appreciate the advocacy.

Adam Sadinsky 52:22-52:24
Absolutely. Thank you.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.uncommons.ca

Uncommons with Nate Erskine-Smith
The Future of Online Harms and AI Regulation with Taylor Owen

Play Episode Listen Later Sep 26, 2025 39:00


After a hiatus, we've officially restarted the Uncommons podcast, and our first long-form interview is with Professor Taylor Owen to discuss the ever-changing landscape of the digital world, the fast emergence of AI, and the implications for our kids, consumer safety, and our democracy.

Taylor Owen's work focuses on the intersection of media, technology and public policy and can be found at taylorowen.com. He is the Beaverbrook Chair in Media, Ethics and Communications and the founding Director of the Centre for Media, Technology and Democracy at McGill University, where he is also an Associate Professor. He is the host of the Globe and Mail's Machines Like Us podcast and author of several books.

Taylor also joined me for this discussion more than 5 years ago now. And a lot has happened in that time.

Upcoming episodes will include guests Tanya Talaga and an episode focused on the border bill C-2, with experts from The Citizen Lab and the Canadian Association of Refugee Lawyers.

We'll also be hosting a live event at the Naval Club of Toronto with Catherine McKenna, who will be launching her new book Run Like a Girl. Register for free through Eventbrite.

As always, if you have ideas for future guests or topics, email us at info@beynate.ca

Chapters:
0:29 Setting the Stage
1:44 Core Problems & Challenges
4:31 Information Ecosystem Crisis
10:19 Signals of Reliability & Policy Challenges
14:33 Legislative Efforts
18:29 Online Harms Act Deep Dive
25:31 AI Fraud
29:38 Platform Responsibility
32:55 Future Policy Direction

Further Reading and Listening:
Public rules for big tech platforms with Taylor Owen — Uncommons Podcast
"How the Next Government can Protect Canada's Information Ecosystem." Taylor Owen with Helen Hayes, The Globe and Mail, April 7, 2025.
Machines Like Us Podcast
Bill C-63

Transcript:

Nate Erskine-Smith 00:00-00:43
Welcome to Uncommons, I'm Nate Erskine-Smith.
This is our first episode back after a bit of a hiatus, and we are back with a conversation focused on AI safety, digital governance, and all of the challenges with regulating the internet. I'm joined by Professor Taylor Owen. He's an expert in these issues. He's been writing about these issues for many years. I actually had him on this podcast more than five years ago, and he's been a huge part of getting us in Canada to where we are today. And it's up to this government to get us across the finish line, and that's what we talk about. Taylor, thanks for joining me.

Taylor Owen
Thanks for having me.

Nate Erskine-Smith
So this feels like deja vu all over again, because I was going back before you arrived this morning, and you joined this podcast in April of 2020 to talk about platform governance.

Taylor Owen 00:43-00:44
It's a different world.

Taylor 00:45-00:45
In some ways.

Nate Erskine-Smith 00:45-01:14
Yeah. Well, yeah, a different world for sure in many ways, but also the same challenges in some ways too. Additional challenges, of course. But I feel like in some ways we've come a long way, because there's been lots of consultation. There have been some legislative attempts, at least. But also we haven't really accomplished the thing. So let's set the stage. Some of the same challenges from five years ago, but some new challenges. What are the challenges? What are the problems we're trying to solve?

Taylor Owen 01:14-03:06
Yeah, I mean, many of them are the same, right? I mean, this is part of it: the technology moves fast. But when you look at the range of things citizens are concerned about when they and their children and their friends and their families use these sets of digital technologies that shape so much of our lives, many things are the same. So they're worried about safety. They're worried about algorithmic content and how that's feeding into what they believe and what they think. They're worried about polarization. We're worried about the integrity of our democracy and our elections.
We're worried about sort of some of the more acute harms, like real risks to safety, right? Like children taking their own lives, and violence erupting, political violence emerging. These things have always been present as a part of our digital lives. And that's what we were concerned about five years ago, right? When we talked about those harms, that was roughly the list. Now, the technologies we were talking about at the time were largely social media platforms, right? So that was the main way, five years ago, that we shared and consumed information in our digital politics and our digital public lives. And that is what's changing slightly. Now, those are still prominent, right? We're still on TikTok and Instagram and Facebook to a certain degree. But we do now have a new layer of AI, and particularly chatbots. And I think a big question we face in this conversation is, like, how do we develop policies that maximize the benefits of digital technologies and minimize the harms, which is all this is trying to do. Do we need new tools for AI, or are the things we worked on for so many years to get right still the right tools for this new set of technologies, with chatbots and various consumer-facing AI interfaces?

Nate Erskine-Smith 03:07-03:55
My line in politics has always been, especially around privacy protections, that we are increasingly living our lives online, and, you know, my kids are growing up online, and our laws need to reflect that reality. All of the challenges you've articulated exist to varying degrees in offline spaces, but the rules we have can be incredibly hard to enforce, at a minimum, in the online space. And then some rules are not entirely fit for purpose and need to be updated for the online space. It's interesting, I was reading a recent op-ed of yours, but also some of the research you've done. This really stood out.
So you've got the Hogue Commission that says disinformation is the single biggest threat to our democracy. That's worth pausing on.

Taylor Owen 03:55-04:31
Yeah, exactly. The commission, at the request of all political parties in Parliament, at the urging of the opposition parties, spent a year looking at a wide range of threats to our democratic systems that everybody was concerned about, originating in foreign countries. And the conclusion of that was that the single biggest threat to our democracy is the way information flows through our society and how we're not governing it. That is a remarkable statement, and it kind of came and went. And I don't know why we moved off from that so fast.

Nate Erskine-Smith 04:31-05:17
Well, and there's a lot to pull apart there, because you've got purposeful, intentional bad actors, foreign influence operations. But you also have a really core challenge of just the reliability and credibility of the information ecosystem. So you have Facebook and Instagram, through Meta, blocking news in Canada. And your research, this was the stat that stood out. I don't want to put you on the spot and say, what do we do? Okay. So, you say 11 million views of news have been lost as a consequence of that blocking. Okay. That's one piece of information people should know. Yeah. But at the same time.

Taylor Owen 05:17-05:17
A day. Yeah.

Nate Erskine-Smith 05:18-05:18
So right.

Taylor Owen 05:18-05:27
11 million views a day. And sometimes we go through these things really fast. It's huge. Again, Facebook decides to block news; 40 million people in Canada. Yeah.

Taylor 05:27-05:29
So 11 million times a Canadian.

Taylor Owen 05:29-05:45
And what that means is 11 million times a day, a Canadian would open one of their news feeds and see that Canadian journalism is taken out of the ecosystem. And it was replaced by something. People aren't using these tools less.
So that journalism was replaced by something else.

Taylor 05:45-05:45
Okay.

Taylor Owen 05:45-05:46
So that's just it.

Nate Erskine-Smith 05:46-06:04
So on the one side, we've got 11 million views a day lost. Yeah. And on the other side, the majority of Canadians get their news from social media. But when the Canadians who get their news from social media are asked where they get it from, they still say Instagram and Facebook. But there's no news there. Right.

Taylor Owen 06:04-06:04
They say they get–

Nate Erskine-Smith 06:04-06:05
It doesn't make any sense.

Taylor Owen 06:06-06:23
It doesn't, and it does. It's terrible. They ask Canadians who use social media to get their news, where do they get their news? And they still say social media, even though it's not there. Journalism isn't there. Journalism isn't there. And I think one of the explanations–

Nate Erskine-Smith
Traditional journalism.

Taylor 06:23-06:23
There is–

Taylor Owen 06:23-06:47
Well, this is what I was going to get at, right? One conclusion, I think, is that people don't equate journalism with news about the world. There's not a one-to-one relationship there. Journalism is one provider of news, but so are influencers, so are podcasts, people listening to this. Like, this would probably be labeled news in people's minds.

Nate Erskine-Smith 06:47-06:48
Can't trust the thing we say.

Taylor Owen 06:48-07:05
Right. And neither of us are journalists, right? But we are providing information about the world. And if it shows up in people's feeds, as I'm sure it will, that probably gets labeled in people's minds as news, right? As opposed to pure entertainment, as entertaining as you are.

Nate Erskine-Smith 07:05-07:06
It's public affairs content.

Taylor Owen 07:06-07:39
Exactly. So that's one thing that's happening. The other is that there's a generation of creators stepping into this ecosystem to fill that void, and they can use these tools much more effectively.
So in the last election, we found that of all the information consumed about the election, 50% of it was created by creators. 50% of the engagement on the election was from creators. Guess what it was for journalists, for journalism? Like 5%? Well, you're more pessimistic though. I shouldn't have led with the question. 20%.

Taylor 07:39-07:39
Okay.

Taylor Owen 07:39-07:56
So all of journalism combined, in the entire country: 20 percent of engagement. Influencers: 50 percent in the last election. So, at least on social, we've shifted the actors and people and institutions that are fostering our public sphere.

Nate Erskine-Smith 07:56-08:09
Is there a middle ground here, where you take some people that play an influencer-type role but also would consider themselves citizen journalists in a way? How do you–

Taylor Owen 08:09-08:31
It's a super interesting question, right? Like, who, and when, are these people doing journalism? When are they doing acts of journalism? Someone can do journalism and 90% of the time do something else, right? And then maybe they reveal something, or they tell an interesting story that resonates with people, or they interview somebody and it's revelatory, and it's a journalistic act, right?

Taylor 08:31-08:34
Like this is kind of a journalistic act we're playing here.

Taylor Owen 08:35-08:49
So I don't think– I think these lines are gray. But there are some other underlying things here, which, like, it matters, I think, if journalistic institutions go away entirely, right? That's probably not a good thing.

Nate Erskine-Smith 08:49-09:30
Yeah, I mean, that's why I say it's terrifying. There's a lot of good in the digital space. There's creative destruction; there's a lot of work to provide people a direct sense of news without that filter that people may mistrust in traditional media.
Having said that, there are so many resources and so much history to these institutions, and there's a real ethics to journalism; journalists take their craft seriously in terms of the pursuit of truth. Absolutely. And losing that access, losing the accessibility of that, is devastating for democracy.

Taylor Owen 09:30-09:49
I think so. And I think the bigger frame of that for me is: a democracy needs signals of– we need, as citizens in a democracy, we need signals of reliability. We need to know broadly, and we're not always going to agree on it, but what kind of information we can trust, and how we evaluate whether we trust it.

Nate Erskine-Smith 09:49-10:13
And that's what– that is really going away. Pause for a sec. "Signals of reliability" is a good phrase. What does it mean for a legislator when it comes to putting a rule in place? Because you could imagine a Blade Runner kind of rule that says you've got to distinguish between something that is human-generated

Taylor 10:13-10:14
and something that is machine-generated.

Nate Erskine-Smith 10:15-10:26
That seems straightforward enough. It's a lot harder if you're trying to distinguish between Taylor, what you're saying is credible, and Nate, what you're saying is not credible,

Taylor 10:27-10:27
which is probably true.

Nate Erskine-Smith 10:28-10:33
But how do you have a signal of reliability in a different kind of content?

Taylor Owen 10:34-13:12
I mean, we're getting into journalism policy here to a certain degree, right? And it's a wicked problem, because the primary role of journalism is to hold you personally to account. And you setting rules for what they can and can't do, and how they can and can't behave, touches on some real third rails here, right? It's fraught. However, I don't think it should ever be about policy determining what can and can't be said, or what is and isn't journalism.
The real problem is the distribution mechanism and the incentives within it. A great example, and a horrible example, happened last week, right? Charlie Kirk gets assassinated. I don't know if you opened a feed in the few days after that, but it was a horrendous place. Social media was an awful, awful, awful place, because what you saw in that feed was the clearest demonstration I've seen in a decade of looking at this of how those algorithmic feeds have become radicalized. All you saw on every platform was the worst possible representations of every view. It was truly shocking and horrendous. People defending the murder and people calling for the murder of leftists, on both sides, right? People blaming Israel, people, whatever. And that isn't a function of – comparing Charlie Kirk to Jesus, sure, it was bonkers all the way around. Totally bonkers, right? And that is a function of how those ecosystems are designed and the incentives within them. It's not that there was no journalism being produced about it. The New York Times and citizens were doing good content about what was happening. It was a moment of uncertainty and journalism was playing a role, but it wasn't the problem. And so I think with all of these questions, including the online harms ones, and how we step into an AI governance conversation, the focus always has to be on those systems. Who and what, and what are the incentives and the technical decisions being made, that determine what we experience when we open these products? These are commercial products that we're choosing to consume. And when we open them, a whole host of business and design and technical and human decisions shape the effect they have on us as people, the effect they have on our democracy, the vulnerabilities that exist in our democracy, the way foreign actors or hostile actors can take advantage of them, right?
Like all of that stuff we've been talking about, the role reliability of information plays: these algorithms could be tweaked for reliable versus unreliable content, right? Over time.

Taylor (13:12-13:15): That's not a – instead of reactionary –

Taylor Owen (13:15-13:42): Or what gets the most engagement, or what makes you feel the most angry, which is largely what's driving X, for example, right now. You can torque all those things. Now, I don't think we want government telling companies how they have to torque it. But we can slightly tweak the incentives to get better content: more reliable content, less polarizing content, less hateful content, less harmful content, right? Those dials can be incentivized to be turned. And that's where the policy space should play, I think.

Nate Erskine-Smith (13:43-14:12): And your focus on systems and assessing risks with systems, I think that's the right place to play. I mean, we've seen legislative efforts. You've got the three pieces in Canada: you've got online harms; you've got the privacy and very kind of vague initial foray into AI regs, which we can get to; and then a cybersecurity piece. And all of those ultimately died on the order paper. Yeah. We also had the journalistic protection policies, right, that the previous government did.

Taylor Owen (14:12-14:23): I mean – yeah, yeah, yeah. We can debate their merits. But there was considerable effort put into backstopping the institutions of journalism by the – well, they're twofold, right?

Nate Erskine-Smith (14:23-14:33): There's the tax credit piece, sort of financial support. And then there was the Online News Act, which was trying to pull some dollars out of the platforms to pay for the news as well. Exactly.

Taylor (14:33-14:35): So the sort of supply and demand side thing, right?

Nate Erskine-Smith (14:35-14:38): There's the digital services tax, which is no longer a thing.

Taylor Owen (14:40-14:52): Although it still is a piece of passed legislation. Yeah, yeah, yeah.
It still is a thing. Yeah, yeah. Until you guys decide whether to negate the thing you did last year or not, right? Yeah.

Nate Erskine-Smith (14:52-14:55): I don't take full responsibility for that one.

Taylor Owen (14:55-14:56): No, you shouldn't.

Nate Erskine-Smith (14:58-16:03): But other countries have seen more success. You've got the UK, Australia; the EU really has led the way. In 2018, the EU passed the GDPR, a privacy set of rules which we are still behind seven years later. Then in 2022 and 2023, you've got the Digital Services Act and the Digital Markets Act. And as I understand it, and we've both been involved in international work on this and heard from folks like Frances Haugen and others about the need for risk-based assessments, and you're well down the rabbit hole on this, isn't it, at a high level: you deploy a technology, you've got to identify material risks, you then have to take reasonable measures to mitigate those risks. That's effectively the duty of care built in. And then ideally you've got the ability for third parties, either civil society or some public office, to audit whether you have adequately identified and disclosed material risks and whether you have taken reasonable steps to mitigate.

Taylor Owen (16:04-16:05): That's like how I have it in my head.

Nate Erskine-Smith (16:05-16:06): I mean, that's it.

Taylor Owen (16:08-16:14): Write it down. Fill in the legislation. Well, I mean, that process happened. I know. That's right. I know.

Nate Erskine-Smith (16:14-16:25): Exactly. Which, I want to get to that, because C-63 gets us a large part of the way there. I think so. And yet it has been sort of cast aside.

Taylor Owen (16:25-17:39): Exactly. Let's touch on that. But I do think what you described is the online harms piece of this governance agenda.
When you look at what the EU has done, they have put in place the various building blocks for what a broad digital governance agenda might look like. Because the reality of this space, which we talked about last time, and it's the thing that's infuriating about digital policy, is that you can't do one thing. The digital economy and our digital lives are so vast, and the incentives and the effect they have on society so broad, that there's no one solution. So anyone who tells you to fix privacy policy and you'll fix all the digital problems we just talked about is full of it. Anyone who says competition policy, like break up the companies, will solve all of these problems is wrong, right? Anyone who says online harms policy, which we'll talk about, fixes everything is wrong. You have to do all of them. And Europe has, right? They updated their privacy policy. They built a big online harms agenda. They updated their competition regime. And they're also doing some AI policy too. So you need comprehensive approaches, which is not an easy thing to do: it means doing three big things all at once.

Nate Erskine-Smith (17:39-17:41): Especially in minority parliaments, short periods of time, legislatively.

Taylor Owen (17:41-18:20): Different countries have taken different pieces of it. Now, on the online harms piece, which the previous government took really seriously, and I think it's worth putting a point on that, right, when we talked last it was the beginning of this process. After we spoke, there was a national expert panel. There were 20 consultations. There were four citizens' assemblies. There was a national commission. A lot of work went into looking at what every other country had done, because this is a really wicked, difficult problem, and into trying to learn from what Europe, Australia and the UK had all done. And we kind of took the benefit of being late, right?
So they were all ahead of us.

Taylor (18:21-18:25): People you work with on that grant committee. We're all quick and do our own consultations.

Taylor Owen (18:26-19:40): Exactly. And the model that was developed out of that, I think, was the best model of any of those countries. And it's now seen internationally, interestingly, as the new sort of milestone that everybody else is building on, right? And what it does is it says: if you're going to launch a digital product, a consumer-facing product, in Canada, you need to assess risk. And you need to assess risk on these broad categories of harms that we have decided as legislators we care about, or you've decided as legislators you cared about, right? Child safety, child sexual abuse material, fomenting violence and extremist content: broad categories that we've said we think are harmful to our democracy. All you have to do as a company is a broad assessment of what could go wrong with your product. If you find something could go wrong, well, let's use a tangible example. Let's say you are a social media platform and you are launching a product that's going to be used by kids, and it allows adults to contact kids without parental consent or without kids opting into being a friend. What could go wrong with that?

Nate Erskine-Smith (19:40): Yeah.

Taylor (19:40-19:43): Like what could go wrong? Yeah, a lot could go wrong.

Taylor Owen (19:43-20:27): And maybe strange men will approach teenage girls. Maybe, right? If you do a risk assessment, that is something you might find. You would then be obligated to mitigate that risk and show how you've mitigated it, right? You put a policy in place to show how you're mitigating it. And then you have to share data about how these tools are used, so that publics and researchers can monitor whether that mitigation strategy worked. That's it.
In that case, that feature was launched by Instagram in Canada without any risk assessment, without any safety evaluation. And we know there was a widespread problem of teenage girls being harassed by strange older men.

Taylor (20:28-20:29): Incredibly creepy.

Taylor Owen (20:29-20:37): A very easy thing, but not a super illegal thing, not something that would be caught by the Criminal Code, but a harm we can all admit is a problem.

Taylor (20:37-20:41): And this kind of mechanism would have just filtered it out.

Taylor Owen (20:41-20:51): Default settings, right? And thinking a bit, before you launch a product in a country, about what kind of broad risks might emerge when it's launched, and being held accountable for doing that.

Nate Erskine-Smith (20:52-21:05): Yeah, I quite like it. I mean, maybe you've got a better read of this, but the UK and California have pursued this. I was looking recently: Elizabeth Denham is now the Jersey Information Commissioner or something like that.

Taylor Owen (21:05-21:06): I know, it's just, yeah.

Nate Erskine-Smith (21:07-21:57): Random, I don't know. But she is a Canadian, for those who don't know Elizabeth Denham. And she was the Information Commissioner in the UK. And she oversaw the implementation of the first age-appropriate design code. That always struck me as an incredibly useful approach, in that even outside of social media platforms, even outside of AI, take a product like Roblox, where tons of kids use it: just force companies to ensure that the default settings prioritize child safety, so that you don't put the onus on parents and kids to figure out each of these different games and platforms. In a previous world of consumer protection, offline, it would have been de facto. Of course we've prioritized consumer safety first and foremost. But in the online world, it's an afterthought.

Taylor Owen (21:58-24:25): Well, when you say consumer safety, it's worth referring back to what we mean.
A duty of care can seem like an obscure concept, but, you're a lawyer, it's a real thing, right? You walk into a store, or I walk into your office, and I have an expectation that the bookshelves aren't going to fall off the wall and kill me, right? And you have to bolt them into the wall because of that. That is a duty of care that you have for me when I walk into your public or private space. That's all we're talking about here.

And the age-appropriate design code, yes, sort of developed and implemented by a Canadian in the UK, was also embedded in the Online Harms Act, right? If we'd passed that last year, we would be implementing an age-appropriate design code as we speak. What that would say is that any product that is likely to be used by a kid needs to do a set of additional things, not just these risk assessments. Because we think kids don't have the same rights as adults; we have different duties to protect kids than adults, right? So maybe they should do an extra set of things for their digital products. And it includes things like no behavioral targeting, no advertising, no data collection, no sexual adult content. Things that seem obvious. And if you're now a child in the UK and you go on a digital product, you are safer, because you have an age-appropriate design code governing your experience online. Canadian kids don't have that, because that bill didn't pass, right? So there's consequences to this stuff.
And I get really frustrated now when I see the conversation pivoting to AI, for example, right? Like all we're supposed to care about is AI adoption and all the amazing things AI is going to do to transform our world, which are probably real; I'm not discounting its power. And we just move on from all of these problems, and the solutions that have been developed for a set of challenges that still exist on social platforms. They haven't gone away; people are still using these tools, the harms still exist, and they're probably applicable to this next set of technologies as well. So this moving on from what we've learned and the work that's been done, to the people working in this space and the wide set of stakeholders in this country who care about this stuff: you said deja vu at the beginning, and it is deja vu, but it's kind of worse, right? Because it's deja vu and then ignoring the

Taylor (24:25-24:29): five years of work. Yeah, deja vu if we were doing it again. Right. We're not even, we're not even –

Taylor Owen (24:29-24:41): Well, yeah. I mean, hopefully. I'm actually optimistic, I would say, that we will, for a few reasons. One, citizens want it, right?

Nate Erskine-Smith (24:41-24:57): Yeah, I was surprised on the – so you mentioned there that the rules that we design, the risk assessment framework really applied to social media, could equally be applied to deliver AI safety, and could be applied to new technology in a useful way.

Taylor Owen (24:58): Some elements of it. Exactly.

Nate Erskine-Smith (24:58-25:25): I think AI safety is a broad bucket of things. So let's get to that a little bit, because I want to pull the pieces together. I had a constituent come into the office, and he is really, like, super mad. He's super mad. Why is he mad? Does that happen very often? Do people get mad when they walk into this office? Not as often as you think, to be honest.
Not as often as you think. And he's mad because he believes Mark Carney ripped him off.

Taylor Owen (25:25): Okay.

Nate Erskine-Smith (25:25-26:36): Okay. Yep. He believes Mark Carney ripped him off. Not with a broken promise in politics, not because he said one thing and is delivering something else, nothing to do with politics. He saw a video online: Mark Carney told him to invest money. He invested money, and he's out the 200 bucks or whatever it was. And I was like, how could you possibly have lost money in this way? This was obviously a scam. How could you have been deceived? But then I go and I watch the video, and, okay, I'm not going to send the 200 bucks, and I've grown up with the internet, but I can see how – Absolutely. In the same way phone scams and Nigerian princes and all of that have their own success rate. I mean, this was a very believable video that was obviously AI-generated. So we are going to see rampant fraud, if we aren't seeing it already, and we are going to see many challenges with respect to AI safety. Over and above the risk assessment piece, what do we do to address these challenges?

Taylor Owen (26:37-27:04): So that is a huge problem, right? AI video fraud is a huge challenge. When we were monitoring the last election, by far the biggest problem, or vulnerability, of the election was an AI-generated video campaign that every day would take videos of Poilievre's and Carney's speeches from the day before and morph them into conversations about investment strategies.

Taylor (27:05-27:07): And it was driving people to a crypto scam.

Taylor Owen (27:08-27:11): But it was torquing the political discourse.

Taylor (27:11): That's what it must have been.

Taylor Owen (27:12-27:33): I mean, there's other cases of this, but that's probably – and it was running rampant, particularly on Meta platforms. They were flagged. They did nothing about it.
There were thousands of these videos circulating throughout the entire election, right? And it's not like the end of the world, but it torqued our political debate. It ripped off some people. And these kinds of scams are –

Taylor (27:33-27:38): Clearly illegal. It probably breaks election law too, misrepresenting a political figure, right?

Taylor Owen (27:38-27:54): So I think there's probably an Elections Canada response to this that's needed. And it's fraud, absolutely. So what do you do about that? The head of the Canadian Bankers Association said there's billions of dollars in AI-based fraud in the Canadian economy right now. So it's a big problem.

Taylor (27:54-27:55): Yeah.

Taylor Owen (27:55-28:46): I actually think there's a very tangible policy solution. You put these consumer-facing AI products into the Online Harms Act framework, right? And then you add fraud and AI scams as a category of harm. And all of a sudden, if you're Meta and you are operating in Canada during an election, you'd have to do a risk assessment on the AI fraud potential of your product. Responsibility for your platform. And then when this stuff starts to circulate, we would see it, they'd be called out on it, they'd have to take it down. So we'd have mechanisms for dealing with this. But it does mean evolving what we worked on over the past five years, these online harms risk assessment models, and bringing some of the consumer-facing AI products and related harms into the framework.

Nate Erskine-Smith (28:47-30:18): To put it a different way: this is years ago now, but we had this grand committee in the UK holding Facebook and others accountable. This really was in the wake of the Cambridge Analytica scandal.
And the platforms at the time were really holding firm to this idea of Section 230 and avoiding host liability, saying, oh, we couldn't possibly be responsible for everything on our platform. And there was one problem with that argument, which is that they completely acknowledged the need to take action when it came to child pornography. So they said, yeah, no liability for us, but of course there can be liability on this one specific piece of content, and we'll take action on this one specific piece of content. And it always struck me from there on out: there's no real intellectual consistency here. It's more just what should be in that category of things they should take responsibility for. Obviously harmful content like that should be – that's an obvious first step, obvious to everyone. But there are other categories. Fraud is another one. When they're making so much money, when they are investing so much money in AI, when they're ignoring privacy protections and everything else throughout the years, we can't leave it up to them. Setting a clear set of rules to say this is what you're responsible for, and expanding that responsibility, seems to make a good amount of sense.

Taylor Owen (30:18-30:28): It does, although I think those responsibilities need to be different for different kinds of harms, because there are different speech implications and democratic implications of sort of absolute solutions to different kinds of content.

Taylor (30:28-30:30): So child pornography is a great example.

Taylor Owen (30:30-31:44): In the Online Harms Act, for almost every type of content, it was that risk assessment model. But there was a carve-out for child sexual abuse material, including child pornography, and for intimate images and videos shared without consent. It said the platforms actually have a different obligation, and that's to take it down within 24 hours.
And the reason you can do it with those two kinds of content is, one, the AI is actually pretty good at spotting it. It might surprise you, but there's a lot of naked images on the internet that we can train AI with, so we're actually pretty good at using AI to pull this stuff down. But the bigger reason is that, as a society, I think it's okay to be wrong in the gray area of that speech. If something is debatable as to whether it's child pornography, I'm actually okay with us suppressing the speech of the person who sits in that gray area. Whereas for something like hate speech, it's a really different story: we do not want to suppress and over-index for that gray area on hate speech, because that's going to capture a lot of reasonable debate that we probably want.

Nate Erskine-Smith (31:44-31:55): Yeah, I think soliciting investment via fraud probably falls more in line with the child pornography category, where it's very obviously illegal.

Taylor Owen (31:55-32:02): And that mechanism is a takedown mechanism, right? If we see fraud, if we know it's fraud, then you take it down. Some of these other things we have to go with.

Nate Erskine-Smith (32:02-32:24): I mean, my last question really is: you pull the threads together, you've got these different pieces that were introduced in the past, and you've got a government with lots of similar folks around the table, but a new government and a new prime minister, certainly with a vision for getting the most out of AI when it comes to our economy.

Taylor (32:24-32:25): Absolutely.

Nate Erskine-Smith (32:25-33:04): You have, for the first time in this country, an AI minister, a junior minister to industry, but still a specific titled portfolio, with his own deputy minister, who really wants to be seized with this.
And in a way, I think, from every conversation I've had with him, he wants to maximize productivity in this country using AI, but is also cognizant of the risks and wants to address AI safety. So where from here? You've talked in the past about a grander sort of tech accountability and sovereignty act. Do we go piecemeal, a privacy bill here, an AI safety bill and an online harms bill there, with disparate pieces? What's the answer here?

Taylor Owen (33:05-34:14): I mean, I don't have the exact answer. But I think there are some lessons from the past that this government could take. One is that piecemeal bills that aren't centrally coordinated, with no connectivity between them, end up as piecemeal solutions that are imperfect and would benefit from some cohesiveness, right? So when the previous government released AIDA, the AI and Data Act, it was really in tension in some real ways with the online harms approach: two different departments issuing two similar bills on two separate technologies, not really talking to each other, as far as I could tell from the outside. So we need a coordinated, comprehensive effort on digital governance. That's point one, and we've never had it in this country. And when I saw the announcement of an AI minister, my mind went first to the idea that he, or that office, could play that role. Because AI is cross-cutting, right? Every department in our federal government touches AI in one way or another. And the governance of AI, and the adoption of AI by society on the other side, is going to affect every department and every bill we need.

Nate Erskine-Smith (34:14-34:35): So if Evan pulled in the privacy pieces, that would help us catch up to GDPR. Which it sounds like they will, right? Some version of C-27 will probably come back.
If he pulls in the online harms pieces that aren't related to the Criminal Code and drops those provisions, says, you know, Sean Fraser, you can deal with this if you like, but these are the pieces I'm holding on to.

Taylor Owen (34:35-34:37): With a frame of consumer safety, right?

Nate Erskine-Smith (34:37): Exactly.

Taylor Owen (34:38-34:39): If he wants...

Nate Erskine-Smith (34:39-34:54): Which is connected to privacy as well, right? These are all... So then you have, thematically, a bill that makes sense. And then you can pull in the AI safety piece as well. And then it becomes a consumer protection bill when it comes to living our lives online. Yeah.

Taylor Owen (34:54-36:06): And I think there's an argument whether that should be one bill or multiple ones. I think there are cases for both, right? There's concern about big omnibus bills that do too many things, and too many committees reviewing them, and whatever; that's sort of a machinery-of-government question. But the principle is that these should be tied together in a narrative that the government is explicit about making and communicating to publics. We know that 85 percent of Canadians want AI to be regulated. What do they mean? What they mean is that at the same time as they're being told by our government and by companies that they should be using and embracing this powerful technology in their lives, they're also seeing some risks. They're seeing risks to their kids. They're being told their jobs might disappear. Why should I use this thing, when I'm seeing some harms, I don't see you guys doing anything about those harms, and I'm seeing some potential real downside for me personally and my family? So even in the adoption frame, I think thinking about data privacy, safety, consumer safety. To me, that's the real frame here. It's citizen safety, consumer safety, using these products.
Yeah, politically, I just – I mean, that is what it is. It makes sense to me.

Nate Erskine-Smith (36:06-36:25): Right, I agree. And really lean into child safety at the same time. I've got a nine-year-old and a five-year-old. They are growing up with the internet. And I do not want to have to police every single platform that they use. I do not want to have to log in and go, these are the default settings on the parental controls.

Taylor (36:25-36:28): I want to turn to government and go, do your damn job.

Taylor Owen (36:28-36:48): Or just make them slightly safer. I know these are going to be imperfect. I have a 12-year-old. He spends a lot of time on YouTube. I know that's always going to be a place with content that I would prefer he doesn't see. But I would just like some basic safety standards on that thing, so he's not seeing the worst of the worst.

Nate Erskine-Smith (36:48-36:58): And we should expect that. Certainly that YouTube, with its promotion engine, the recommendation function, is not actively promoting terrible content to your 12-year-old.

Taylor Owen (36:59-37:31): Yeah. That's like de minimis. Can we just torque this a little bit, right? So maybe he's not seeing horrible content about Charlie Kirk when he's a 12-year-old on YouTube. Can we just do something? And I think that's a reasonable expectation as a citizen. But it requires governance. And it's worth putting a real emphasis on this: one thing we've learned in this moment of repeated deja vus, going back 20 years really, since our experience with social media, through to now, is that these companies don't self-govern.

Taylor (37:31): Right.

Taylor Owen (37:32-37:39): We know that indisputably. So to think that AI is going to be different is delusional. No, they'll pursue profit, not the public interest.

Taylor (37:39-37:44): Of course. Because that's what they are. These are the largest companies in the world. Yeah, exactly.
And AI companies are even bigger than the last generation, right?

Taylor Owen (37:44-38:00): We're creating something new with the scale of these companies. And to think that their commercial incentives and their broader long-term goals around AI are not going to override these safety concerns is just naive in the nth degree.

Nate Erskine-Smith (38:00-38:38): But I think you make the right point, and it's useful to close on this: these goals of realizing the productivity possibilities and potential of AI, alongside AI safety, are not mutually exclusive or oppositional goals. You create a sandbox to play in, and companies will be more successful. If you have certainty in regulations, companies will be more successful. And if people feel safe using these tools (certainly, if I feel safe with my kids learning these tools in their classrooms and everything else), adoption rates will soar. Absolutely. And then we'll benefit.

Taylor Owen (38:38-38:43): They work in tandem, right? And I think you can't have one without the other, fundamentally.

Nate Erskine-Smith (38:45-38:49): Well, I hope I don't invite you back five years from now to have the same conversation.

Taylor Owen (38:49-38:58): Well, I hope you invite me back in five years, but I hope it's to think back on all the legislative successes of the previous five years. I mean, that'll be the moment.

Taylor (38:58-38:59): Sounds good. Thanks, David. Thanks.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.uncommons.ca

Security Conversations
Zero-day reality check: iOS exploits, MAPP in China and the hack-back temptation


Aug 22, 2025 · 152:15


Three Buddy Problem - Episode 59: Apple drops another emergency iOS patch and we unpack what that “may have been exploited” language really means: zero-click chains, why notifications help but forensics don't, and the uncomfortable truth that Lockdown Mode is increasingly the default for high-risk users. We connect the dots from ImageIO bugs to geopolitics, discuss who's likely using these exploits, why Apple's guidance stops short, and the practical playbook (ADP on, reboot often, reduce attack surface) that actually works. Plus, we debate Microsoft throttling MAPP access for Chinese vendors, the idea of “letters of marque” for cyber (outsourced offense: smart deterrent or Pandora's box?), and dissect two case studies that blur APT and crimeware: PipeMagic's CLFS zero-day and Russia-linked “Static Tundra” riding seven-year-old Cisco bugs. Cast: Juan Andres Guerrero-Saade (https://twitter.com/juanandres_gs), Ryan Naraine (https://twitter.com/ryanaraine) and Costin Raiu (https://twitter.com/craiu).

The Naked Emperor
Introducing: Click Here | Citizen Lab is still chasing shadows


Jul 15, 2025 · 22:51


Today we're sharing an episode of the Click Here podcast from Recorded Future News and PRX. The early Internet was all about hope and utopian possibilities. But the founder of the Citizen Lab, Ron Deibert, always had an unsettled feeling about the web and its dark underbelly. So he created a team of digital sleuths to investigate. More episodes of Click Here are available at: https://podcasts.apple.com/us/podcast/click-here/id1225077306

Black Hills Information Security
Denmark is Done with Teams! - 2025-06-16

Black Hills Information Security

Play Episode Listen Later Jun 18, 2025 56:19


Register for Free, Live webcasts & summits: https://poweredbybhis.com

00:00 - PreShow Banter™ — Government Linux
04:16 - Denmark is Done with Teams! - Talkin' Bout [infosec] News 2025-06-16
05:02 - Story # 1: ‘We're done with Teams': German state hits uninstall on Microsoft
17:34 - Story # 1b: Denmark Wants to Dump Microsoft Software for Linux, LibreOffice
18:14 - Story # 2: Zero-click AI data leak flaw uncovered in Microsoft 365 Copilot
25:50 - Story # 3: Fog ransomware attacks use employee monitoring tool to break into business networks
30:25 - Story # 4: Expired Discord Invites Hijacked for Stealthy Malware Attacks
34:00 - Story # 5: SmartAttack uses smartwatches to steal data from air-gapped systems
40:25 - Story # 6: Mirai Botnets Exploiting Wazuh Security Platform Vulnerability
44:47 - Story # 7: Google Cloud and Cloudflare hit by widespread service outages
48:04 - Story # 8: UNFI cyberattack shuts down network and leaves Whole Foods and others in limbo
50:34 - Story # 9: New SharePoint Phishing Attacks Using Lick Deceptive Techniques
51:08 - Story # 10: US-backed Israeli company's spyware used to target European journalists, Citizen Lab finds
53:32 - Story # 11: Five Zero-Days, 15 Misconfigurations Found in Salesforce Industry Cloud

Doppio Click
Doppio Click, Monday 16/06/2025

Doppio Click

Play Episode Listen Later Jun 16, 2025 21:58


The practical limits of the idea of moving to a form of remote digital voting, the news on the Paragon case that emerged after the publication of The Citizen Lab's forensic analysis, and the possible backstory behind Starlink's activation in Iran at the height of the conflict with Israel. Presented by Marco Schiaffino.

Security Conversations
Cyber flashpoints in Israel-Iran war, the 'magnet of threats', Mossad drone swarms

Security Conversations

Play Episode Listen Later Jun 13, 2025 111:48


Three Buddy Problem - Episode 50: This week, we dissect cyber flashpoints in the Iran-Israel war, revisit the “magnet of threats” server in Iran that attracted APTs from multiple nation-states, and react to Israel's Mossad sneaking explosive drone swarms deep into Iran to support airstrikes. Plus, Stealth Falcon's new WebDAV zero-day, SentinelOne's brush with Chinese APTs, Citizen Lab's forensic takedown of Paragon's iPhone spyware, and the sneaky Meta/Yandex trick that links Android web browsing to app IDs. Cast: Juan Andres Guerrero-Saade (https://twitter.com/juanandres_gs), Ryan Naraine (https://twitter.com/ryanaraine) and Costin Raiu (https://twitter.com/craiu).

Ideas from CBC Radio (Highlights)
How spyware abusers can easily hack your phone and surveil you

Ideas from CBC Radio (Highlights)

Play Episode Listen Later Apr 15, 2025 54:08


We are all vulnerable to digital surveillance, as there's little protection to prevent our phones from getting hacked. Mercenary spyware products like Pegasus are powerful and sophisticated, marketed to government clients around the world. Cybersecurity expert Ron Deibert tells IDEAS, "the latest versions can be implanted on anyone's device anywhere in the world and as we speak, there is literally no defence against it.” Deibert is the founder of the Citizen Lab at the University of Toronto, a group of tech-savvy researchers who dig into the internet, looking for the bad actors in the marketplace for high-tech surveillance and disinformation. In his new book, Chasing Shadows, he shares notorious cases he and his colleagues have worked on and reveals the dark underworld of digital espionage and subversion.

The Agenda with Steve Paikin (Audio)
How Cyber Espionage Threatens Democracy in the Era of Trump

The Agenda with Steve Paikin (Audio)

Play Episode Listen Later Mar 26, 2025 24:20


Since 2001, Ron Deibert and his team at the University of Toronto's Citizen Lab have uncovered dozens of covert spy operations around the world, including the creators of the phone hacking spyware, Pegasus, created by the Israeli company, NSO group, whose clients include Saudi Arabia's Mohammed bin Salman, and Rwanda's Paul Kagame. In a wide-ranging discussion, Deibert tells host Steve Paikin about his recent trip to the White House, the impact that the Trump administration's policies will have on cyber security worldwide, and why Canadians ought to be concerned by a bilateral agreement with the U.S. called the Cloud Act. His new book is called, "Chasing Shadows: Cyber Espionage, Subversion, and the Global Fight for Democracy." See omnystudio.com/listener for privacy information.

Security Conversations
China exposing Taiwan hacks, Paragon spyware and WhatsApp exploits, CISA budget cuts

Security Conversations

Play Episode Listen Later Mar 21, 2025 116:22


Three Buddy Problem - Episode 39: Luta Security CEO Katie Moussouris joins the buddies to parse news around a coordinated Chinese exposure of Taiwan APT actors, CitizenLab's report on Paragon spyware and WhatsApp exploits, an “official” Russian government exploit-buying operation shopping for Telegram exploits, the fragmentation of exploit markets and the future of CISA in the face of budget cuts and layoffs. Cast: Katie Moussouris (https://lutasecurity.com), Juan Andres Guerrero-Saade (https://twitter.com/juanandres_gs), Costin Raiu (https://twitter.com/craiu) and Ryan Naraine (https://twitter.com/ryanaraine).

The Daily Crunch – Spoken Edition
Researchers name several countries as potential Paragon spyware customers

The Daily Crunch – Spoken Edition

Play Episode Listen Later Mar 20, 2025 6:58


The Citizen Lab said it believes several governments may be customers of spyware maker Paragon Solutions. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Security Conversations
North Korea's biggest ever crypto heist: $1.4B stolen from Bybit

Security Conversations

Play Episode Listen Later Feb 23, 2025 127:07


Three Buddy Problem - Episode 35: Juanito is live from DistrictCon with notes on discussion of an elusive iOS zero-day by a company called QuaDream and Apple's controversial removal of iCloud backup end-to-end encryption in the UK. We also cover a staggering $1.4 billion hack by the Lazarus Group against Bybit, new angles in NSA-linked cyber-espionage against China's top universities, Chinese hacking gangs moonlighting as ransomware criminals, and Russian APTs abusing Signal's “linked devices” feature. Plus, Costin explains Microsoft's quantum computing breakthrough. Cast: Juan Andres Guerrero-Saade (https://twitter.com/juanandres_gs), Costin Raiu (https://twitter.com/craiu) and Ryan Naraine (https://twitter.com/ryanaraine).

The Sunday Magazine
Inside the shadowy cyber espionage world that's threatening democracies

The Sunday Magazine

Play Episode Listen Later Feb 12, 2025 24:47


As founder and director of the University of Toronto's Citizen Lab, Ron Deibert has spent his career tracking down and uncovering some of the world's most clandestine cyber espionage operations. Now, the cybersecurity expert is pulling back the curtain on this shadowy world in his new book, Chasing Shadows. Deibert tells David Common how our democracies have become vulnerable to these threats as we become more reliant on technology – and what we need to do to protect them.

A Little More Conversation with Ben O’Hara-Byrne
Ron Deibert tackles cyber espionage, subversion, and the global fight for democracy in new book

A Little More Conversation with Ben O’Hara-Byrne

Play Episode Listen Later Feb 12, 2025 33:58


Guest: Ron Deibert, founder, Citizen Lab at the University of Toronto and author of Chasing Shadows.

Big Tech
New Spyware Has Made Your Phone Less Secure Than You Might Think

Big Tech

Play Episode Listen Later Feb 11, 2025 36:19


It's become pretty easy to spot phishing scams: UPS orders you never made, banking alerts from companies you don't bank with, phone calls from unfamiliar area codes. But over the past decade, these scams – and the technology behind them – have become more sophisticated, invasive and sinister, largely due to the rise of something called ‘mercenary spyware.'

The most potent version of this tech is Pegasus, a surveillance tool developed by an Israeli company called NSO Group. Once Pegasus infects your phone, it can see your texts, track your movement, and download your passwords – all without you realizing you'd been hacked.

We know a lot of this because of Ron Deibert. Twenty years ago, he founded Citizen Lab, a research group at the University of Toronto that has helped expose some of the most high-profile cases of cyber espionage around the world.

Ron has a new book out called Chasing Shadows: Cyber Espionage, Subversion, and the Global Fight for Democracy, and he sat down with me to explain how spyware works, and what it means for our privacy – and our democracy.

Note: We reached out to NSO Group about the claims made in this episode and they did not reply to our request for comment.

Mentioned:
“Chasing Shadows: Cyber Espionage, Subversion, and the Global Fight for Democracy,” by Ron Deibert
“Meta's WhatsApp says spyware company Paragon targeted users in two dozen countries,” by Raphael Satter, Reuters

Further Reading:
“The Autocrat in Your iPhone,” by Ron Deibert
“A Comprehensive Analysis of Pegasus Spyware and Its Implications for Digital Privacy and Security,” by Karwan Kareem
“Stopping the Press: New York Times Journalist Targeted by Saudi-linked Pegasus Spyware Operator,” by Bill Marczak, Siena Anstis, Masashi Crete-Nishihata, John Scott-Railton, and Ron Deibert

The Sunday Magazine
Trump's whirlwind week, Cyber espionage and democracy, Bill Gates, Canada's economic strategy

The Sunday Magazine

Play Episode Listen Later Feb 9, 2025 100:49


Guest host David Common speaks with The Washington Post's Toluse "Tolu" Olorunnipa and Semafor's Kadia Goba about Donald Trump's flurry of controversial actions this past week, The Citizen Lab's Ron Deibert takes us inside the shadowy world of cyber espionage, Microsoft co-founder Bill Gates reflects on how his early years shaped him, and political economist Mark Manger and Mathew Holmes from the Canadian Chamber of Commerce consider strategies to bolster Canada's economy amid Trump's tariff pause.Discover more at https://www.cbc.ca/sunday

The Herle Burly
Cybersecurity and Cyber Espionage with Ron Deibert

The Herle Burly

Play Episode Listen Later Feb 9, 2025 64:36


The Herle Burly was created by Air Quotes Media with support from our presenting sponsor TELUS, as well as CN Rail, and TikTok Canada.

Greetings, you ever-curious Herle Burly-ites. I'm going to get right to it today, because we've got a topic that's both pervasive and invasive, and I've wanted to explore it for a while... It's Cybersecurity and Cyber Espionage Day on the podcast!

With me is a guest whose CV in the field is as long and tall as a grain silo in my hometown of Prelate, Saskatchewan. Ron Deibert is here. He is a professor of political science and the founder and director of the Citizen Lab – the world's foremost digital watchdog – at the Munk School of Global Affairs & Public Policy, University of Toronto. The Citizen Lab focuses on research, development, as well as strategic policy and legal engagement at the intersection of information and communication technologies, human rights, and global security.

Ron's been a principal investigator and contributing author on more than 160 reports covering cyber espionage, commercial spyware, Internet censorship and human rights. Those reports have yielded over 25 front-page exclusives in the New York Times, Washington Post, Financial Times and other media outlets, and have been cited by policymakers and academics. And his brand-new book – “Chasing Shadows” – tells the story of the Citizen Lab and the dozens of cyber espionage cases it's exposed.

So, we're going to find out more about Ron today: his backstory and what led him to the field. We'll dive into some of the cases he's been involved with, and how governments and bad actors use these surveillance techniques via our own computers and smartphones. And we'll ask the question: how the hell can we be protected from all of this?

Thank you for joining us on #TheHerleBurly podcast. Please take a moment to give us a rating and review on iTunes, Spotify, or your favourite podcast app.

Watch episodes of The Herle Burly via Air Quotes Media on YouTube.

PRI: Science, Tech & Environment
WhatsApp identifies dozens of users hacked by Paragon spyware company

PRI: Science, Tech & Environment

Play Episode Listen Later Feb 6, 2025


WhatsApp, used by millions of people around the world, says its users were hacked by the Paragon Solutions spyware company. The World's Host Marco Werman speaks with John Scott-Railton, a senior researcher at Citizen Lab, about the continuing threat of sophisticated spyware. The post WhatsApp identifies dozens of users hacked by Paragon spyware company appeared first on The World from PRX.

SpyTalk
A “Bonanza" of Spyware Abuses

SpyTalk

Play Episode Listen Later Jan 31, 2025 49:45


Ronald Deibert of Citizen Lab tells host Michael Isikoff how his research firm has uncovered the shocking abuse of commercial spyware by foreign governments and spy agencies around the world, resulting in a proliferation of "Watergate-like” scandals.

Follow our guest:
Ron Deibert: https://x.com/RonDeibert
Chasing Shadows: https://www.simonandschuster.ca/books/Chasing-Shadows/
The Citizen Lab: https://citizenlab.ca/ and https://x.com/citizenlab

Follow Jeff Stein on Twitter: https://twitter.com/SpyTalker
Follow Michael Isikoff on Twitter: https://twitter.com/isikoff
Follow SpyTalk on Twitter: https://twitter.com/talk_spy
Subscribe to SpyTalk on Substack: https://www.spytalk.co/

Take our listener survey where you can give us feedback: http://survey.podtrac.com/start-survey.aspx?pubid=BffJOlI7qQcF&ver=short

Beyond the Headlines
American Fascism and the Tech Barons of Authoritarianism: How Silicon Valley Enables Trump's Information War

Beyond the Headlines

Play Episode Listen Later Jan 27, 2025 59:00


Tech oligarchs have risen to dominate global politics and public discourse, posing grave threats to democracy and governance. Under Donald Trump's presidency, the consolidation of power among Silicon Valley elites has exacerbated critical challenges, including the spread of misinformation, the weaponization of social media, and the unchecked development of artificial intelligence. These forces have not only deepened political polarization but also paved the way for the normalization of extremism, undermining the foundations of truth in the digital era. The intersection of technological exploitation, political radicalization, and the information war presents urgent questions for the future of democratic societies.

In this episode of Beyond the Headlines, we unpack these critical dynamics with two distinguished guests. Andres Kasekamp, an expert on populist radical right movements and European governance, explores the historical and political parallels of authoritarian trends. Ron Deibert, a global authority on cybersecurity and digital rights, highlights the ways in which tech platforms enable political manipulation and disinformation campaigns. Together, they offer in-depth insights into the complex role of digital platforms in amplifying authoritarianism and discuss potential pathways for mitigating their impact on democracy.

Andres Kasekamp is the Elmar Tampõld Chair of Estonian Studies and Professor of History at the University of Toronto's Department of History and the Munk School of Global Affairs and Public Policy. He is a leading scholar on Baltic politics, memory politics, and populist radical right movements. Formerly a Professor of Baltic Politics at the University of Tartu and Director of the Estonian Foreign Policy Institute in Tallinn, Kasekamp has held visiting positions at esteemed institutions such as Humboldt University in Berlin and the Norwegian Institute for International Affairs. Among his acclaimed works is A History of the Baltic States, which has been translated into multiple languages and remains a definitive text in the field. His research explores European foreign and security policy and the intricate dynamics of cooperation and conflict in the Baltic Sea region. Currently, he is editing The Oxford Handbook of Modern Baltic History.

Ron Deibert is a Professor of Political Science and the Director of the Citizen Lab at the Munk School of Global Affairs & Public Policy, University of Toronto. A pioneer in cybersecurity and human rights, Deibert has led the Citizen Lab's groundbreaking investigations into cyber espionage, commercial spyware, and digital censorship, producing over 120 influential reports. These include the Tracking Ghostnet investigation into cyber-espionage and the Reckless series, which revealed spyware abuses targeting journalists and activists. Deibert is also the author of RESET: Reclaiming the Internet for Civil Society, a winner of the Shaughnessy Cohen Prize for Political Writing. His work has earned numerous accolades, including the Electronic Frontier Foundation Pioneer Award and the Order of Ontario. Beyond academia, he serves on advisory boards for organizations like Amnesty International and PEN Canada, making him a critical voice in addressing the intersection of technology, democracy, and civil liberties.

Produced by: Julia Brahy

Security Conversations
Inside the Turla Playbook: Hijacking APTs and fourth-party espionage

Security Conversations

Play Episode Listen Later Dec 7, 2024 107:08


Three Buddy Problem - Episode 24: In this episode, we dig into Lumen/Microsoft's revelations on Russia's Turla APT stealing from a Pakistani APT, and issues around fourth-party espionage and problems with threat actor attribution. We also discuss Citizen Lab's findings on Monokle-like spyware implanted by Russian authorities, the slow pace of Salt Typhoon disinfection, the Solana web3.js supply chain attack affecting crypto projects, and the Romanian election crisis over Russian interference via TikTok. Cast: Juan Andres Guerrero-Saade (https://twitter.com/juanandres_gs), Costin Raiu (https://twitter.com/craiu) and Ryan Naraine (https://twitter.com/ryanaraine).

La W Radio con Julio Sánchez Cristo
“It's like having a spy in your pocket”: how ‘Pegasus' works, according to Citizen Lab

La W Radio con Julio Sánchez Cristo

Play Episode Listen Later Sep 27, 2024 25:16


Ronald Deibert, director of the Citizen Lab organization at the University of Toronto, spoke with La W about the ‘Pegasus' spyware.

The Decibel
University of Toronto lab unmasks Russian hacking campaign

The Decibel

Play Episode Listen Later Aug 20, 2024 17:35


By now, most people know how to recognize the signs of a phishing e-mail – poor spelling and grammar, strange sender e-mail addresses, and of course, an instruction to click on a link, where you're asked to put in your banking or login credentials. But these scams are becoming more sophisticated and politically motivated.Last week, Citizen Lab at the University of Toronto uncovered what they're calling the River of Phish campaign, which uses sophisticated social engineering practices to target people, including a former U.S. ambassador to Ukraine. The Globe's telecom reporter Alexandra Posadzki is on the show to talk about what Citizen Lab found, how the scheme works, and what we know about the Russia-linked group behind it.Questions? Comments? Ideas? Email us at thedecibel@globeandmail.com

Decipher Security Podcast
Rebekah Brown and John Scott-Railton on COLDRIVER and Russian Cyberespionage

Decipher Security Podcast

Play Episode Listen Later Aug 19, 2024 23:12


Rebekah Brown and John Scott-Railton of the Citizen Lab join Dennis Fisher to dive into their group's new report on highly targeted spear phishing campaigns by the Russian threat actor COLDRIVER and then discuss the emergence of a new, possibly related group called COLDWASTREL. 

Trouble with the Truth
Pegasus strikes again: how Russian and Belarusian independent journalists became new targets of the hacking software 

Trouble with the Truth

Play Episode Listen Later Jun 28, 2024 32:46


On 30 May, a new report produced by the digital rights organisation Access Now and Citizen Lab revealed the details of the latest Pegasus attack on Russian and Belarusian journalists and activists. Pegasus, sophisticated spyware made by Israel's NSO Group, made headlines in 2021 when it was discovered to have been used against thousands of people all over the globe, including human rights activists and media workers.

What makes this spyware so dangerous is that it doesn't require clicking on a link, and some victims may never discover that they've been hacked. It can penetrate iOS and Android systems and gain full access to a device, including photos, passwords, emails and even the microphone.

In this episode of Trouble with the Truth, Lana talks to Natalia Krapiva, the Senior Tech-Legal Counsel at Access Now, about the latest targets of Pegasus attacks. Among them: the CEO of Novaya Gazeta, Maria Epifanova; journalists Evgeny Pavlov and Evgeny Erlikh; Belarusian activist Andrei Sannikov; and the editor-in-chief of the independent Belarusian media website Charter97.org, Natallia Radzina.

They discuss what makes Pegasus so hard to identify and who could be behind it. While Russian and Belarusian authorities are the most obvious suspects, the truth is more complex. Finally, Natalia shares some useful advice on how journalists can protect themselves from spyware and what steps they should take if they discover they've been hacked.

Useful resources:
Access Now Digital Security Helpline: https://www.accessnow.org/help/
Citizen Lab Tools & Resources: https://citizenlab.ca/category/research/tools-resources/
Justice for Journalists Media Safety Academy: https://jfj.academy/en/

The Shared Security Show
Citizen Lab vs. NSO Group, Apple AI and Privacy

The Shared Security Show

Play Episode Listen Later Jun 17, 2024 17:06


In episode 334, hosts Tom Eston, Scott Wright, and Kevin Johnson discuss two major topics. First, they explore the ongoing legal battle between Citizen Lab and the Israeli spyware company NSO Group. The courts have consistently blocked NSO's attempts to access Citizen Lab's documents to protect victim privacy. Second, they discuss Apple's new AI features […] The post Citizen Lab vs. NSO Group, Apple AI and Privacy appeared first on Shared Security Podcast.

TRENDIFIER with Julian Dorey
[VIDEO] - Gray Hat Hacker EXPOSES How Gov Spyware is BRAINWASHING You | Jonathan Scott • 209

TRENDIFIER with Julian Dorey

Play Episode Listen Later May 30, 2024 173:45


(***TIMESTAMPS in description below)

Jonathan Scott is a Gray Hat Hacker. He is known for exposing the *real* story behind "Hotel Rwanda" and for his expertise on NSO Group's mysterious spyware, "Pegasus."

- BUY Guest's Books & Films IN MY AMAZON STORE: https://amzn.to/3RPu952

EPISODE LINKS:
- Julian Dorey PODCAST MERCH: https://juliandorey.myshopify.com/
- Support our Show on PATREON: https://www.patreon.com/JulianDorey
- Join our DISCORD: https://discord.gg/Ajqn5sN6

JONATHAN SCOTT'S LINKS:
- JONATHAN'S YOUTUBE: https://www.youtube.com/c/jonathandata1

JULIAN YT CHANNELS:
- SUBSCRIBE to Julian Dorey Clips YT: https://www.youtube.com/@juliandoreyclips
- SUBSCRIBE to Julian Dorey Daily YT: https://www.youtube.com/@JulianDoreyDaily
- SUBSCRIBE to Best of JDP: https://www.youtube.com/@bestofJDP

***TIMESTAMPS***
00:00 - Grey Hat Hacker, Pegasus, Bitcoin Controversy Case

The Lawfare Podcast
Lawfare Archive: Trump Takes Aim at TikTok and WeChat

The Lawfare Podcast

Play Episode Listen Later May 27, 2024 55:25


From August 12, 2020: President Trump recently issued executive orders aimed at banning TikTok and WeChat from operating in the United States. To discuss the sanction, Bobby Chesney sat down with Dr. Sheena Chestnut Greitens, an associate professor at the LBJ School of Public Affairs at the University of Texas at Austin and a faculty affiliate with the Strauss Center for International Security and Law and the Clements Center for National Security at UT; and Dr. Ronald Deibert, a professor of political science and the founder and director of The Citizen Lab at the University of Toronto's Munk School of Global Affairs and Public Policy. In addition to the executive orders concerning TikTok and WeChat, they also discussed the larger U.S.-China relationship and the role of technology competition in that space.To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/c/trumptrials.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Reveal
The Spy Inside Your Smartphone

Reveal

Play Episode Listen Later Apr 27, 2024 49:34


Around the globe, journalists, human rights activists, scholars and others are facing digital attacks from Pegasus, military-grade spyware originally developed to go after criminals. Some of the people targeted have been killed or are in prison.

In this episode, Reveal partners with the Shoot the Messenger podcast to investigate one of the biggest Pegasus hacks ever uncovered: the targeting of El Faro newspaper in El Salvador.

In the opening story, hosts Rose Reid and Nando Vila speak with El Faro co-founder Carlos Dada and reporter Julia Gavarrete. El Faro has been lauded for its investigations into government corruption and gang violence. The newspaper is no stranger to threats and intimidation, which have increased under the administration of President Nayib Bukele.

Reid and Vila also speak with John Scott-Railton of Citizen Lab, a Toronto-based digital watchdog group. Scott-Railton worked to identify the El Faro breach, and it was one of the most obsessive cases of spying Citizen Lab has ever seen. Over the course of one year, 22 members of the newspaper's staff had their phones infected with Pegasus and were surveilled by a remote operator. Researchers suspect Bukele's government was behind the spying, though officials have denied those allegations. The breach forced El Faro's journalists to change the way they work and live and take extreme measures to protect sources and themselves.

Then Reid talks with Reveal's Al Letson about growing efforts to hold the NSO Group, the company behind Pegasus, accountable for the massive digital attacks.

Support Reveal's journalism at Revealnews.org/donatenow
Subscribe to our weekly newsletter to get the scoop on new episodes at Revealnews.org/newsletter
Connect with us on Twitter, Facebook and Instagram

Ideas from CBC Radio (Highlights)
Massey at 60: Ron Deibert on how spyware is changing the nature of authority today

Ideas from CBC Radio (Highlights)

Play Episode Listen Later Apr 25, 2024 54:08


Citizen Lab founder and director Ron Deibert reflects on what's changed in the world of spyware, surveillance, and social media since he delivered his 2020 CBC Massey Lectures, Reset: Reclaiming the Internet for Civil Society. *This episode is part of an ongoing series of episodes marking the 60th anniversary of Massey College, a partner in the Massey Lectures.

The Daily Decrypt - Cyber News and Discussions
Keyboard App Vulnerabilities, Ring Privacy Settlement, Cyber Attacker Dwell Time Reduction

The Daily Decrypt - Cyber News and Discussions

Play Episode Listen Later Apr 25, 2024


Explore cybersecurity threats and solutions with experts analyzing critical vulnerabilities in keyboard apps, a $5.6 million privacy breach settlement for Ring users, and the latest trends in cyber attacker dwell times. Gain insights on global security measures and personal privacy protection.

Sources:
https://citizenlab.ca/2024/04/vulnerabilities-across-keyboard-apps-reveal-keystrokes-to-network-eavesdroppers/
https://www.bleepingcomputer.com/news/security/ring-customers-get-56-million-in-privacy-breach-settlement/
https://www.helpnetsecurity.com/2024/04/24/2023-attacker-dwell-time/

00:00 Intro
01:03 Deep Dive into Keyboard App Vulnerabilities and User Protection Tips
03:39 Ring's Privacy Breach: Details and Consumer Compensation
06:09 Cybersecurity Wins: Decreased Attacker Dwell Time and Enhanced Defenses
09:53 Conclusion: The Future of Cybersecurity and the Role of Large Language Models

Tags: cybersecurity, privacy breach, keyboard apps, encryption, Ring settlement, attacker dwell time, data protection, smart home security

Search Phrases: keyboard app security flaws; Ring privacy breach settlement details; reducing cyber attacker dwell time; encryption vulnerabilities in keyboard apps; FTC refund to Ring users; how to protect against cybersecurity threats; latest trends in cybersecurity attacks; privacy and security in smart home devices

Summarized Transcript:

Welcome to the Daily Decrypt, your essential guide to navigating the digital domain. In today's episode, we're uncovering critical vulnerabilities in popular Chinese pinyin keyboard apps, exploring a substantial privacy breach with Ring's camera system, and diving into the global improvements in cybersecurity detection times. Join us as we decode the digital world, keeping your data safe and your curiosity alive.

Our journey begins with a startling revelation from Citizen Lab. Over 1 billion users of popular Chinese pinyin keyboard apps are at risk of having their keystrokes decrypted. Among the inspected vendors - Baidu, Honor, Huawei, iFlytek, Oppo, Samsung, Tencent, Vivo, and Xiaomi - most apps remain a breach waiting to happen, with network eavesdroppers able to exploit vulnerabilities passively. How can users shield themselves against such invasive threats? Turning off cloud-based services and opting for a more secure keyboard ecosystem are steps in the right direction.

Next, we delve into the breach that shook trust to its core: Ring's privacy debacle. A staggering $5.6 million in refunds is being distributed to affected customers, a move prompted by the Federal Trade Commission after unauthorized access to private video feeds came to light. The case brings to the forefront the critical need for robust security measures in IoT devices, especially those designed for security, like cameras. How did Ring respond to the breach, and what can consumers learn from this incident to protect their own digital footprints?

On a brighter note, global security saw an inspiring leap forward in 2023. Organizations now detect intrusions in a median of 10 days, a significant improvement from the previous 16 days in 2022. This progress indicates a strengthening of defense mechanisms against cyber threats. But with ransomware and zero-day exploits on the rise, how can organizations maintain this momentum and ensure the safety of our digital realms?

Additionally, the emergence of large language models like OpenAI's introduces new dynamics in both defense and offense within cybersecurity. These powerful tools aid in the development of new technologies and the fast analysis of vast datasets. However, the unrestricted usage by attackers versus the ethical constraints on defenders presents unique challenges. How will this play out in the evolving cybersecurity landscape?

This has been the Daily Decrypt. If today's episode unlocked new perspectives for you, show your support with a rating on Spotify or Apple Podcasts. Follow us on Instagram, or catch our episodes on YouTube for more insights into the cyber world. Until next time, keep your data safe and your curiosity sparked.

The Sunday Show
What Leverage Remains to Preserve Free Expression in Hong Kong?

The Sunday Show

Play Episode Listen Later Feb 29, 2024 45:33


This week, a public consultation period ended for a new Hong Kong national security law, known as Article 23. Article 23 ostensibly targets a wide array of crimes, including treason, theft of state secrets, espionage, sabotage, sedition, and "external interference" from foreign governments. The Hong Kong legislature, dominated by pro-Beijing lawmakers, is expected to approve it, even as its critics argue that the law criminalizes basic human rights, such as the freedom of expression, signaling a further erosion of the liberties once enjoyed by the residents of Hong Kong.
To learn more about what is happening in Hong Kong and what role tech firms and other outside voices could be doing to preserve freedoms for the people of Hong Kong, Justin Hendrix spoke to three experts who are following developments there closely:
Chung Ching Kwong, senior analyst at the Inter-Parliamentary Alliance on China;
Lokman Tsui, a fellow at Citizen Lab at the University of Toronto; and
Michael Caster, the Asia Digital Program Manager with Article 19.

Eunoia: Beautiful Thinkers

The sixth episode of Season VI "Saturated": IU Edition welcomes Jason Q. Ng, author of Blocked on Weibo, where he writes about Chinese internet censorship. He currently works as a data scientist at Duolingo. In this episode Jason discusses how he has used data throughout his career to democratize information and help people learn about everything from incarceration rates in his work as a researcher with Citizen Lab, to helping artists connect with listeners on Spotify, to helping people worldwide learn a new language through his work at Duolingo. He talks about the use of propaganda in China to misinform citizens through social media and the complexity of who should be the arbiter of free speech on a global stage. From the moment he started a blog on Chinese censorship that would eventually lead to his book, Jason has been moved to help people learn and act on data in a way that positively impacts society. He currently lives and works in New York City with his wife and son.

Shoot the Messenger: Espionage, Murder & Pegasus Spyware
Exiled Russian Journalist Hacked with Pegasus

Shoot the Messenger: Espionage, Murder & Pegasus Spyware

Play Episode Listen Later Oct 31, 2023 38:10


In this bonus episode of Shoot the Messenger, we share a special interview host Rose Reid did with Russian journalist and Meduza founder Galina Timchenko. Citizen Lab and Access Now confirmed that Galina Timchenko's phone had been infected with Pegasus, the first documented case of the use of Pegasus against a Russian journalist. Before Galina Timchenko founded Meduza, she ran one of Russia's most popular media outlets, Lenta.ru. She was fired as Lenta.ru's chief editor in 2014, after Vladimir Putin returned to power and the same year as Russia's annexation of Crimea. Since 2014, Galina and her team have been reporting on Russia in exile. We'll launch our second season, which investigates "Who Killed the President of Haiti?", in early 2024. In the meantime, we will bring you monthly bonus episodes featuring our favorite shows and updates on Pegasus.

The Lawfare Podcast
Lawfare Archive: Trump Takes Aim at TikTok and WeChat

The Lawfare Podcast

Play Episode Listen Later Oct 9, 2023 55:24


From August 12, 2020: President Trump recently issued executive orders aimed at banning TikTok and WeChat from operating in the United States. To discuss the orders, Bobby Chesney sat down with Dr. Sheena Chestnut Greitens, an associate professor at the LBJ School of Public Affairs at the University of Texas at Austin and a faculty affiliate with the Strauss Center for International Security and Law and the Clements Center for National Security at UT; and Dr. Ronald Deibert, a professor of political science and the founder and director of The Citizen Lab at the University of Toronto's Munk School of Global Affairs and Public Policy. In addition to the executive orders concerning TikTok and WeChat, they also discussed the larger U.S.-China relationship and the role of technology competition in that space.
Support this show: http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Reveal
The Spy Inside Your Smartphone

Reveal

Play Episode Listen Later Sep 23, 2023 50:47


Around the globe, journalists, human rights activists, scholars and others are facing digital attacks from Pegasus, military-grade spyware originally developed to go after criminals. Some of the people targeted have been killed or are in prison. In this episode, Reveal partners with the Shoot the Messenger podcast to investigate one of the biggest Pegasus hacks ever uncovered: the targeting of El Faro newspaper in El Salvador. In the opening story, hosts Rose Reid and Nando Vila speak with El Faro co-founder Carlos Dada and reporter Julia Gavarrete. El Faro has been lauded for its investigations into government corruption and gang violence. The newspaper is no stranger to threats and intimidation, which have increased under the administration of President Nayib Bukele. Reid and Vila also speak with John Scott-Railton of Citizen Lab, a Toronto-based digital watchdog group. Scott-Railton worked to identify the El Faro breach, and it was one of the most obsessive cases of spying Citizen Lab has ever seen. Over the course of one year, 22 members of the newspaper's staff had their phones infected with Pegasus and were surveilled by a remote operator. Researchers suspect Bukele's government was behind the spying, though officials have denied those allegations. The breach forced El Faro's journalists to change the way they work and live and take extreme measures to protect sources and themselves.  Then Reid talks with Reveal's Al Letson about growing efforts to hold the NSO Group, the company behind Pegasus, accountable for the massive digital attacks. Support Reveal's journalism at Revealnews.org/donatenow Subscribe to our weekly newsletter to get the scoop on new episodes at Revealnews.org/newsletter Connect with us on Twitter, Facebook and Instagram

The Naked Pravda
The Pegasus spyware attack on Meduza

The Naked Pravda

Play Episode Listen Later Sep 16, 2023 48:48


On June 23, 2023, hours before Yevgeny Prigozhin would shock the world by staging a mutiny against the Russian military, Meduza co-founder and CEO Galina Timchenko learned that her iPhone had been infected months earlier with “Pegasus.” The spyware's Israeli designers market the product as a crimefighting super-tool against “terrorists, criminals, and pedophiles,” but states around the world have abused Pegasus to track critics and political adversaries who sometimes end up arrested or even murdered. Access to Pegasus isn't cheap: Researchers believe the service costs tens of millions of dollars, meaning that somebody — some government agency out there — paid maybe a million bucks to hijack Timchenko's smartphone. Why would somebody do that? How would somebody do that? And who could have done it? For answers, The Naked Pravda turned to two experts: Natalia Krapiva, tech-legal counsel for Access Now, a nonprofit organization committed to “defending and extending” the digital civil rights of people worldwide, and John Scott-Railton, a senior researcher at Citizen Lab, an interdisciplinary laboratory at the University of Toronto that investigates digital espionage against civil society. Timestamps for this episode: (3:39) Galina Timchenko's hacked iPhone is the first confirmed case of a Pegasus infection against a Russian journalist (6:16) NSO Group's different contract tiers for Pegasus users (9:59) How aware is NSO Group of Pegasus's rampant misuse? (12:29) Why hasn't Europe done more to restrict the use of such spyware? (15:50) Russian allies using Pegasus (17:58) E.U. members using Pegasus (21:37) Training required to use Pegasus and the spyware's technical side (27:38) The forensics needed to detect a Pegasus infection (35:46) Is Pegasus built more to find criminals or members of civil society? 
(40:10) Imagining a global moratorium on military-grade spyware (43:22) "A German solution" (45:14) Where the West goes from here
How to support our newsroom, even if you're in Russia and you're very scared

Darknet Diaries
137: Predator

Darknet Diaries

Play Episode Listen Later Sep 5, 2023 69:28 Very Popular


A new type of mercenary spyware came on the radar called Predator. It'll infect a mobile phone and then suck up all the data from it: contacts, text messages, location, and more. This malware is being sold to intelligence agencies around the world. In this episode we hear from Crofton Black at Lighthouse Reports, who spent 6 months with a team of journalists researching this story, which was published here: https://www.lighthousereports.com/investigation/flight-of-the-predator/. We also hear from Bill Marczak and John Scott-Railton from Citizen Lab. If you want to hear about other mercenary spyware, check out episodes 99 and 100, about NSO Group and Pegasus. To hear another episode about Greece, check out episode 64, called Athens Shadow Games.
Sponsors:
Support for this show comes from Axonius. The Axonius solution correlates asset data from your existing IT and security solutions to provide an always up-to-date inventory of all devices, users, cloud instances, and SaaS apps, so you can easily identify coverage gaps and automate response actions. Axonius gives IT and security teams the confidence to control complexity by mitigating threats, navigating risk, decreasing incidents, and informing business-level strategy, all while eliminating manual, repetitive tasks. Visit axonius.com/darknet to learn more and try it free.
Support for this show comes from Varonis. Do you wonder what your company's ransomware blast radius is? Varonis does a free cyber resilience assessment that tells you how many important files a compromised user could steal, whether anything would beep if they did, and a whole lot more. They actually do all the work: show you where your data is too open, if anyone is using it, and what you can lock down before attackers get inside. They also can detect behavior that looks like ransomware and stop it automatically. To learn more visit www.varonis.com/darknet.
Support for this show comes from Akamai Connected Cloud (formerly Linode).
Akamai Connected Cloud supplies you with virtual servers. Visit linode.com/darknet and get a special offer. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Smashing Security
.ZIP domains, AI lies, and did social media inflame a riot?

Smashing Security

Play Episode Listen Later Jun 1, 2023 76:32


ChatGPT hallucinations cause turbulence in court, a riot in Wales may have been ignited on social media, and do you think .MOV is a good top-level domain for "a website that moves you"?
All this and much much more is discussed in the latest edition of the "Smashing Security" podcast by computer security veterans Graham Cluley and Carole Theriault, joined this week by Mark Stockley.
Plus don't miss our featured interview with David Ahn of Centripetal.
Warning: This podcast may contain nuts, adult themes, and rude language.
Episode links:
8 new top-level domains for dads, grads and techies - Google.
Tweet by Citizen Lab's John Scott-Railton - Twitter.
File Archiver in the browser - mr.d0x.
A Lawyer's Filing "Is Replete with Citations to Non-Existent Cases" - Thanks, ChatGPT? - Reason.
Ely riot: Live updates as police investigate CCTV showing police van following bike moments before fatal crash - Wales Online.
Cardiff riot: Police force refers itself to watchdog as CCTV shows its van following e-bike before fatal crash - Sky News.
Two boys killed in Cardiff crash which was followed by riot are named - Sky News.
Cardiff riots: social media rumours about crash started unrest, says police commissioner - The Guardian.
Black Butterflies - Netflix.
Black Butterflies trailer - YouTube.
"The End of the World Is Just the Beginning: Mapping the Collapse of Globalization" by Peter Zeihan - Amazon.
Science Vs - Gimlet Media Podcast.
Smashing Security merchandise (t-shirts, mugs, stickers and stuff)
Sponsored by:
Bitwarden – Password security you can trust. Bitwarden is an open source password manager trusted by millions of individuals, teams, and organizations worldwide for secure password storage and sharing.
Kolide – Kolide ensures that if your device isn't secure it can't access your cloud...

Shoot the Messenger: Espionage, Murder & Pegasus Spyware

A special bonus episode from one of our favorite podcasts, Click Here. Click Here is a podcast about the world of cyber and intelligence hosted by Dina Temple-Raston. Click Here did a special episode about Pegasus spyware in Mexico: Classified documents and internal memos in a new report from digital activists in Mexico make clear the Mexican Army systematically deployed Pegasus spyware against local journalists and activists. R3D, a Mexican digital rights group, and the University of Toronto's Citizen Lab also discovered the existence of a formerly unknown military intelligence unit whose sole purpose appears to be secret surveillance and deployment of spyware.
https://podcasts.apple.com/us/podcast/click-here/id1225077306

Ideas from CBC Radio (Highlights)
Disinformation and Democracy: A Conversation with Maria Ressa and Ron Deibert

Ideas from CBC Radio (Highlights)

Play Episode Listen Later May 10, 2023 54:08


Nobel Peace Prize recipient Maria Ressa believes online disinformation could pose an existential threat to democracy — and she's not alone. Ressa joins Citizen Lab founder Ron Deibert for a conversation about how online impunity is eroding civil society and how we can fight back.

The Cyberlaw Podcast
Does the government need a warrant to warn me about a cyberattack?

The Cyberlaw Podcast

Play Episode Listen Later May 2, 2023 56:11


We open this episode of the Cyberlaw Podcast with some actual news about the debate over renewing section 702 of FISA. That's the law that allows the government to target foreigners for a national security purpose and to intercept their communications in and out of the U.S. A lot of attention has been focused on what happens to those communications after they've been intercepted and stored, and particularly whether the FBI should get a second court authorization—maybe even a warrant based on probable cause—to search for records about an American. Michael J. Ellis reports that the Office of the Director of National Intelligence has released new data on such FBI searches. Turns out, they've dropped from almost 3 million last year to nearly 120 thousand this year. In large part the drop reflects the tougher restrictions imposed by the FBI on such searches. Those restrictions were also made public this week. It has also emerged that the government is using section 702 millions of times a year to identify the victims of cyberattacks (makes sense: foreign hackers are often a national security concern, and their whole business model is to use U.S. infrastructure to communicate [in a very special way] with U.S. networks.) So it turns out that all those civil libertarians who want to make it hard for the government to search 702 for the names of Americans are proposing ways to slow down and complicate the process of warning hacking victims. Thanks a bunch, folks! Justin Sherman covers China's push to attack and even take over enemy (U.S.) satellites. This story is apparently drawn from the Discord leaks, and it has the ring of truth. I opine that the Defense Department has gotten a little too comfortable waging war against people who don't really have an army, and that the Ukraine conflict shows how much tougher things get when there's an organized military on the other side. (Again, credit for our artwork goes to Bing Image Creator.) 
Adam Candeub flags the next Supreme Court case to nibble away at the problem of social media and the law. We can look forward to an argument next year about the constitutionality of public officials blocking people who post mean comments on the officials' Facebook pages. Justin and I break down a story about whether Twitter is complying with more government demands under Elon Musk. The short answer is yes. This leads me to ask why we expect social media companies to spend large sums fighting government takedown and surveillance requests when it's much cheaper just to comply. So far, the answer has been that mainstream media and Good People Everywhere will criticize companies that don't fight. But with criticism of Elon Musk's Twitter already turned up to 11, that's not likely to persuade him. Adam and I are impressed by Citizen Lab's report on search censorship in China. We'd both kind of like to see Citizen Lab do the same thing for U.S. censorship, which somehow gets less transparency. If you suspect that's because there's more censorship than U.S. companies want to admit, here's a straw in the wind: Citizen Lab reports that the one American company still providing search services in China, Microsoft Bing, is actually more aggressive about stifling political speech than China's main search engine, Baidu. This fits with my discovery that Bing's Image Creator refused to construct an image using Taiwan's flag. (It was OK using U.S. and German flags, but not China's.) I also credit Microsoft for fixing that particular bit of overreach: You can now create images with both Taiwanese and Chinese flags. Adam covers the EU's enthusiasm for regulating other countries' companies. It has designated 19 tech giants as subject to its online content rules. Of the 19, one is a European company, and two are Chinese (counting TikTok). The rest are American companies.
I cover a case that I think could be a big problem for the Biden administration as it ramps up its campaign for cybersecurity regulation. Iowa and a couple of other states are suing to block the Environmental Protection Agency's legally questionable effort to impose cybersecurity requirements on public water systems, using an "interpretation" to read cybersecurity requirements into a law that never had them before. Michael Ellis and I cover the story detailing a former NSA director's business ties to Saudi Arabia, and expand it to confess our unease at the number of generals and admirals moving from command of U.S. forces to a consulting gig with the countries they were just negotiating with. Recent restrictions on the revolving door for intelligence officers get a mention. Adam covers the Quebec decision awarding $500 thousand to a man who couldn't get Google to consistently delete a false story portraying him as a pedophile and conman. Justin and I debate whether Meta's Reels feature has what it takes to be a plausible TikTok competitor. Justin is skeptical. I'm a little less so. Meta's claims about the success of Reels aren't entirely persuasive, but perhaps it's too early to tell. The D.C. Circuit has killed off the state antitrust case trying to undo Meta's long-ago acquisition of WhatsApp and Instagram. The states waited too long, the court held. That doctrine doesn't apply the same way to the Federal Trade Commission (FTC), which will get to pursue a lonely battle against long odds for years. If the FTC is going to keep sending its lawyers into battle like conscripts in Bakhmut, I ask, when will the commission start recruiting in Russian prisons? That was fast. Adam tells us that the Brazil court order banning Telegram because it wouldn't turn over information on neo-Nazi groups has been overturned on appeal. But Telegram isn't out of the woods. The appeal court left in place fines of $200 thousand a day for noncompliance.
And in another regulatory walkback, Italy's privacy watchdog is letting ChatGPT back into the country. I suspect the Italian government of cutting a deal to save face as it abandons its initial position on ChatGPT's scraping of public data to train the model. Finally, in policies I wish they would walk back, four U.S. regulatory agencies claimed (plausibly) that they had authority to bring bias claims against companies using AI in a discriminatory fashion. Since I don't see any way to bring those claims without arguing that any deviation from proportional representation constitutes discrimination, this feels like a surreptitious introduction of quotas into several new parts of the economy, just as the Supreme Court seems poised to cast doubt on such quotas in higher education.  Download 455th Episode (mp3) You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@gmail.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug! The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Democracy Paradox
Jamie Susskind Explains How to Use Republican Ideals to Govern Technology

Democracy Paradox

Play Episode Listen Later Apr 25, 2023 48:16 Transcription Available


"The problem in both cases is not Zuckerberg or Musk, but the idea of a Zuckerberg or Musk. The idea that, simply by virtue of owning and controlling a particular technology, someone wields arbitrary or unaccountable power which can touch every aspect of our liberty and our democracy." - Jamie Susskind
Access Bonus Episodes on Patreon
Make a one-time Donation to Democracy Paradox.
Jamie Susskind is an author and barrister. He has held fellowships at Cambridge and Harvard Universities. His work is at the crossroads of technology, politics, and law. His most recent book is The Digital Republic: On Freedom and Democracy in the 21st Century.
Key Highlights:
Introduction - 0:44
Challenges of Digital Technology - 3:18
Artificial Intelligence - 20:09
A Digital Republic - 40:27
Possible Solutions - 43:42
Key Links:
The Digital Republic: On Freedom and Democracy in the 21st Century by Jamie Susskind
Follow Jamie Susskind on Twitter @jamiesusskind
Learn more about Jamie Susskind
Democracy Paradox Podcast:
Samuel Woolley on Bots, Artificial Intelligence, and Digital Propaganda
Ronald Deibert from Citizen Lab on Cyber Surveillance, Digital Subversion, and Transnational Repression
More Episodes from the Podcast
More Information:
Democracy Group
Apes of the State created all Music
Email the show at jkempf@democracyparadox.com
Follow on Twitter @DemParadox, Facebook, Instagram @democracyparadoxpodcast
100 Books on Democracy
Democracy Paradox is part of the Amazon Affiliates Program and earns commissions on items purchased from links to the Amazon website. All links are to recommended books discussed in the podcast or referenced in the blog.
Support the show

Shoot the Messenger: Espionage, Murder & Pegasus Spyware

Shoot the Messenger: Espionage, Murder and Pegasus Spyware continues with its eighth episode, a special interview with acclaimed journalist Carlos Dada about the intense targeting of him and his newsroom, El Faro, in El Salvador. El Faro is no stranger to threats and intimidation, which have increased under the administration of President Nayib Bukele. Pegasus was used to spy on Carlos Dada for more than 100 days in a row. Between June 2020 and November 2021, more than 20 members of El Faro were infected with NSO Group's Pegasus spyware. John Scott-Railton of Citizen Lab worked to identify the El Faro breach, one of the most obsessive cases of spying Citizen Lab has ever seen. Shoot the Messenger is hosted by Rose Reid and Nando Vila and is a production of Exile Content Studio. Guests: Carlos Dada and John Scott-Railton

Shoot the Messenger: Espionage, Murder & Pegasus Spyware
7. Pegasus, Netanyahu's Foreign Bargaining Chip

Shoot the Messenger: Espionage, Murder & Pegasus Spyware

Play Episode Listen Later Apr 4, 2023 34:01


Shoot the Messenger: Espionage, Murder and Pegasus Spyware continues with its seventh episode, revealing a pattern of Pegasus being used as a bargaining chip in foreign relations. Over the past decade, under the leadership of Prime Minister Benjamin Netanyahu, there has been a direct correlation between his travels and meet-and-greets with world leaders and the proliferation of Pegasus spyware. Where Netanyahu goes, Pegasus seems to follow. As Netanyahu asserts his control over a divided Israel, should we expect to see an increase in the scope of NSO Group's capabilities in digital surveillance? This industry has boomed during Netanyahu's tenure, and he has famously said, "Don't over-regulate." Shoot the Messenger is hosted by Rose Reid and Nando Vila and is a production of Exile Content Studio. Guests: Keshet's Amitai Ziv; Financial Times' Mehul Srivastava; Citizen Lab's Scott Stedman

Recorded Future - Inside Threat Intelligence for Cyber Security
57. Enemy of the State (Part 1): Mexico, spyware, and a secret military intelligence unit

Recorded Future - Inside Threat Intelligence for Cyber Security

Play Episode Listen Later Mar 7, 2023 25:31 Very Popular


A new report has published classified documents and internal memos that make clear the Mexican Army bought Pegasus spyware and systematically deployed it against journalists and activists in Mexico. R3D, a Mexican digital rights group, and the University of Toronto's Citizen Lab also found evidence of a formerly unknown military intelligence unit whose sole focus appears to be secret surveillance and deployment of spyware. Some of the sensitive material published in the report came from a massive hack into the Ministry of Defense by the hacktivist group Guacamaya last year. Click Here was part of a small group of journalists given early access to their findings.

Skullduggery
The Autocrat in your Phone (w/ Ron Deibert)

Skullduggery

Play Episode Listen Later Dec 16, 2022 41:18


On this episode of the podcast, we sit down with Ron Deibert, who runs the University of Toronto's Citizen Lab, to discuss the "mercenary spyware" industry and its proclivity for providing "almost god-like" spyware programs to governments that have been proven to use them to surveil "opposition politicians, human rights activists, journalists, academics, embassy workers, and political dissidents."
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.