After a hiatus, we've officially restarted the Uncommons podcast, and our first long-form interview is with Professor Taylor Owen to discuss the ever-changing landscape of the digital world, the fast emergence of AI, and the implications for our kids, consumer safety and our democracy.

Taylor Owen's work focuses on the intersection of media, technology and public policy and can be found at taylorowen.com. He is the Beaverbrook Chair in Media, Ethics and Communications and the founding Director of The Centre for Media, Technology and Democracy at McGill University, where he is also an Associate Professor. He is the host of the Globe and Mail's Machines Like Us podcast and author of several books. Taylor also joined me for this discussion more than 5 years ago now. And a lot has happened in that time.

Upcoming episodes will include guest Tanya Talaga and an episode focused on the border bill C-2, with experts from The Citizen Lab and the Canadian Association of Refugee Lawyers. We'll also be hosting a live event at the Naval Club of Toronto with Catherine McKenna, who will be launching her new book Run Like a Girl. Register for free through Eventbrite. As always, if you have ideas for future guests or topics, email us at info@beynate.ca

Chapters:
0:29 Setting the Stage
1:44 Core Problems & Challenges
4:31 Information Ecosystem Crisis
10:19 Signals of Reliability & Policy Challenges
14:33 Legislative Efforts
18:29 Online Harms Act Deep Dive
25:31 AI Fraud
29:38 Platform Responsibility
32:55 Future Policy Direction

Further Reading and Listening:
Public rules for big tech platforms with Taylor Owen — Uncommons Podcast
“How the Next Government can Protect Canada's Information Ecosystem.” Taylor Owen with Helen Hayes, The Globe and Mail, April 7, 2025.
Machines Like Us Podcast
Bill C-63

Transcript:

Nate Erskine-Smith00:00-00:43Welcome to Uncommons, I'm Nate Erskine-Smith. 
This is our first episode back after a bit of a hiatus, and we are back with a conversation focused on AI safety, digital governance, and all of the challenges with regulating the internet. I'm joined by Professor Taylor Owen. He's an expert in these issues. He's been writing about these issues for many years. I actually had him on this podcast more than five years ago, and he's been a huge part of getting us in Canada to where we are today. And it's up to this government to get us across the finish line, and that's what we talk about. Taylor, thanks for joining me. Thanks for having me. So this feels like deja vu all over again, because I was going back, before you arrived this morning, and you joined this podcast in April of 2020 to talk about platform governance.Taylor Owen00:43-00:44It's a different world.Taylor00:45-00:45In some ways.Nate Erskine-Smith00:45-01:14Yeah. Well, yeah, a different world for sure in many ways, but also the same challenges in some ways too. Additional challenges, of course. But I feel like in some ways we've come a long way because there's been lots of consultation. There have been some legislative attempts at least, but also we haven't really accomplished the thing. So let's set the stage. Some of the same challenges from five years ago, but some new challenges. What are the challenges? What are the problems we're trying to solve? Yeah, I mean, many of them are the same, right?Taylor Owen01:14-03:06I mean, this is part of it, the technology moves fast. But when you look at the range of things citizens are concerned about when they and their children and their friends and their families use these sets of digital technologies that shape so much of our lives, many things are the same. So they're worried about safety. They're worried about algorithmic content and how that's feeding into what they believe and what they think. They're worried about polarization. We're worried about the integrity of our democracy and our elections. 
We're worried about sort of some of the more acute harms of like real risks to safety, right? Like children taking their own lives and violence erupting, political violence emerging. Like these things have always been present as a part of our digital lives. And that's what we were concerned about five years ago, right? When we talked about those harms, that was roughly the list. Now, the technologies we were talking about at the time were largely social media platforms, right? So that was the main way five years ago that we shared and consumed information in our digital politics and our digital public lives. And that is what's changing slightly. Now, those are still prominent, right? We're still on TikTok and Instagram and Facebook to a certain degree. But we do now have a new layer of AI and particularly chatbots. And I think a big question we face in this conversation is, like, how do we develop policies that maximize the benefits of digital technologies and minimize the harms, which is all this is trying to do. Do we need new tools for AI, or are some of the things we worked on for so many years to get right still the right tools for this new set of technologies, with chatbots and various consumer-facing AI interfaces?Nate Erskine-Smith03:07-03:55My line in politics has always been, especially around privacy protections, that we are increasingly living our lives online. And especially, you know, my kids are growing up online and our laws need to reflect that reality. All of the challenges you've articulated to varying degrees exist in offline spaces, but can be incredibly hard. The rules we have can be incredibly hard to enforce at a minimum in the online space. And then some rules are not entirely fit for purpose and they need to be updated in the online space. It's interesting. I was reading a recent op-ed of yours, but also some of the research you've done. This really stood out. 
So you've got the Hogue Commission that says disinformation is the single biggest threat to our democracy. That's worth pausing on.Taylor Owen03:55-04:31Yeah, exactly. Like the commission that spent a year at the request of all political parties in parliament, at the urging of the opposition party, so it spent a year looking at a wide range of threats to our democratic systems that everybody was concerned about originating in foreign countries. And the conclusion of that was that the single biggest threat to our democracy is the way information flows through our society and how we're not governing it. Like that is a remarkable statement and it kind of came and went. And I don't know why we moved off from that so fast.Nate Erskine-Smith04:31-05:17Well, and there's a lot to pull apart there because you've got purposeful, intentional, bad actors, foreign influence operations. But you also have a really core challenge of just the reliability and credibility of the information ecosystem. So you have Facebook and Instagram, through Meta, blocking news in Canada. And your research, this was the stat that stood out. Don't want to put you on the spot and say like, what do we do? Okay. So there's, you say 11 million views of news have been lost as a consequence of that blocking. Okay. That's one piece of information people should know. Yeah. But at the same time.
So that journalism was replaced by something else.Taylor05:45-05:45Okay.Taylor Owen05:45-05:46So that's just it.Nate Erskine-Smith05:46-06:04So on the one side, we've got 11 million views a day lost. Yeah. And on the other side, Canadians, the majority of Canadians get their news from social media. But when the Canadians who get their news from social media are asked where they get it from, they still say Instagram and Facebook. But there's no news there. Right.Taylor Owen06:04-06:04They say they get.Nate Erskine-Smith06:04-06:05It doesn't make any sense.Taylor Owen06:06-06:23It doesn't and it does. It's terrible. They ask Canadians – people who use social media to get their news – where do they get their news? And they still say social media, even though it's not there. Journalism isn't there. Journalism isn't there. And I think one of the explanations— Traditional journalism. There is—
So in the last election, we found that of all the information consumed about the election, 50% of it was created by creators. 50% of the engagement on the election was from creators. Guess what it was for journalists, for journalism? Like 5%. Well, you're more pessimistic though. I shouldn't have led with the question. 20%.Taylor07:39-07:39Okay.Taylor Owen07:39-07:56So all of journalism combined in the entire country, 20 percent of engagement; influencers, 50 percent in the last election. So like we've shifted, at least on social, the actors and people and institutions that are fostering our public.Nate Erskine-Smith07:56-08:09Is there a middle ground here where you take some people that play an influencer type role but also would consider themselves citizen journalists in a way? How do you – It's a super interesting question, right?Taylor Owen08:09-08:31Like who – when are these people doing journalism? When are they doing acts of journalism? Like someone can do journalism and 90% of the time do something else, right? And then like maybe they reveal something or they tell an interesting story that resonates with people or they interview somebody and it's revelatory and it's a journalistic act, right?Taylor08:31-08:34Like this is kind of a journalistic act we're playing here.Taylor Owen08:35-08:49So I don't think – I think these lines are gray. But I mean, there's some other underlying things here. Like, it matters, I think, if journalistic institutions go away entirely, right? Like, that's probably not a good thing. Yeah. I mean, that's whyNate Erskine-Smith08:49-09:30I say it's terrifying. There's a lot of good in the digital space. There's creative destruction; there's a lot of work to provide people a direct sense of news without that filter that people may mistrust in traditional media. 
Having said that, there are so many resources, and there's so much history to these institutions, and there's a real ethics to journalism, and journalists take their craft seriously in terms of the pursuit of truth. Absolutely. And losing that access, losing the accessibility to that, is devastating for democracy. I think so.Taylor Owen09:30-09:49And I think the bigger frame of that for me is a democracy needs signals of – we need – as citizens in a democracy, we need signals of reliability. Like we need to know broadly, and we're not always going to agree on it, but like what kind of information we can trust and how we evaluate whether we trust it.Nate Erskine-Smith09:49-10:13And that's what – that is really going away. Pause for a sec. So you could imagine – signals of reliability is a good phrase. What does it mean for a legislator when it comes to putting a rule in place? Because you could imagine, you could have a Blade Runner kind of rule that says you've got to distinguish between something that is human generatedTaylor10:13-10:14and something that is machine generated.Nate Erskine-Smith10:15-10:26That seems straightforward enough. It's a lot harder if you're trying to distinguish between Taylor, what you're saying is credible, and Nate, what you're saying is not credible,Taylor10:27-10:27which is probably true.Nate Erskine-Smith10:28-10:33But how do you have a signal of reliability in a different kind of content?Taylor Owen10:34-13:12I mean, we're getting into like journalism policy here to a certain degree, right? And it's a wicked problem because the primary role of journalism is to hold you personally to account. And you setting rules for what they can and can't do and how they can and can't behave touches on some real like third rails here, right? It's fraught. However, I don't think it should ever be about policy determining what can and can't be said or what is and isn't journalism. 
The real problem is the distribution mechanism and the incentives within it. So a great example and a horrible example happened last week, right? So Charlie Kirk gets assassinated. I don't know if you opened a feed in the few days after that, but it was a horrendous place, right? Social media was an awful, awful, awful place because what you saw in that feed was the clearest demonstration I've ever seen in a decade of looking at this of how those algorithmic feeds have become radicalized. Like all you saw on every platform was the worst possible representations of every view. Right. Right. It was truly shocking and horrendous. Like people defending the murder and people calling for the murder of leftists and like on both sides. Right. People blaming Israel, people, whatever. Right. And that isn't a function of like– Comparing Charlie Kirk to Jesus. Sure. Like– It was bonkers all the way around. Totally bonkers, right? And that is a function of how those ecosystems are designed and the incentives within them. It's not a function of like there was journalism being produced about that. Like the New York Times, citizens were doing good content about what was happening. It was like a moment of uncertainty and journalism was playing a role, but it wasn't – And so I think with all of these questions, including the online harms ones, and I think how we step into an AI governance conversation, the focus always has to be on those systems. Like, who and what are the incentives and the technical decisions being made that determine what we experience when we open these products? These are commercial products that we're choosing to consume. And when we open them, a whole host of business and design and technical decisions and human decisions shape the effect it has on us as people, the effect it has on our democracy, the vulnerabilities that exist in our democracy, the way foreign actors or hostile actors can take advantage of them, right? 
Like all of that stuff we've been talking about, the role reliability of information plays, like these algorithms could be tweaked for reliable versus unreliable content, right? Over time.Taylor13:12-13:15That's not a – instead of reactionary –Taylor Owen13:15-13:42Or like what's most – it gets most engagement or what makes you feel the most angry, which is largely what's driving X, for example, right now, right? You can torque all those things. Now, I don't think we want government telling companies how they have to torque it. But we can slightly tweak the incentives to get better content, more reliable content, less polarizing content, less hateful content, less harmful content, right? Those dials can be incentivized to be turned. And that's where the policy space should play, I think.Nate Erskine-Smith13:43-14:12And your focus on systems and assessing risks with systems. I think that's the right place to play. I mean, we've seen legislative efforts. You've got the three pieces in Canada. You've got online harms. You've got the privacy and very kind of vague initial foray into AI regs, which we can get to. And then a cybersecurity piece. And all of those ultimately died on the order paper. Yeah. We also had the journalistic protection policies, right, that the previous government did.Taylor Owen14:12-14:23I mean – Yeah, yeah, yeah. We can debate their merits. Yeah. But there was considerable effort put into backstopping the institutions of journalism by the – Well, they're twofold, right?Nate Erskine-Smith14:23-14:33There's the tax credit piece, sort of financial support. And then there was the Online News Act. Right. Which was trying to pull some dollars out of the platforms to pay for the news as well. Exactly.Taylor14:33-14:35So the sort of supply and demand side thing, right?Nate Erskine-Smith14:35-14:38There's the digital service tax, which is no longer a thing.Taylor Owen14:40-14:52Although it still is a piece of past legislation. Yeah, yeah, yeah. 
It still is a thing. Yeah, yeah. Until you guys decide whether to negate the thing you did last year or not, right? Yeah.Nate Erskine-Smith14:52-14:55I don't take full responsibility for that one.Taylor Owen14:55-14:56No, you shouldn't.Nate Erskine-Smith14:58-16:03But other countries have seen more success. Yeah. And so you've got in the UK, in Australia, the EU really has led the way. 2018, the EU passes GDPR, which is a set of privacy rules, which we are still behind seven years later. But you've got in 2022, 2023, you've got the Digital Services Act that passes. You've got the Digital Markets Act. And as I understand it, and we've had, you know, we've both been involved in international work on this. And we've heard from folks like Frances Haugen and others about the need for risk-based assessments. And you're well down the rabbit hole on this. But isn't it at a high level? You deploy a technology. You've got to identify material risks. You then have to take reasonable measures to mitigate those risks. That's effectively the duty of care built in. And then ideally, you've got the ability for third parties, either civil society or some public office, that has the ability to audit whether you have adequately identified and disclosed material risks and whether you have taken reasonable steps to mitigate.Taylor Owen16:04-16:05That's like how I have it in my head.Nate Erskine-Smith16:05-16:06I mean, that's it.Taylor Owen16:08-16:14Write it down. Fill in the legislation. Well, I mean, that process happened. I know. That's right. I know.Nate Erskine-Smith16:14-16:25Exactly. Which people, I want to get to that because C-63 gets us a large part of the way there. I think so. And yet has been sort of like cast aside.Taylor Owen16:25-17:39Exactly. Let's touch on that. But I do think what you described is the online harms piece of this governance agenda. 
When you look at what the EU has done, they have put in place the various building blocks for what a broad digital governance agenda might look like. Because the reality of this space, which we talked about last time, and it's the thing that's infuriating about digital policy, is that you can't do one thing. There's no – the digital economy and our digital lives are so vast and the incentives and the effect they have on society is so broad that there's no one solution. So anyone who tells you fix privacy policy and you'll fix all the digital problems we just talked about is full of it. Anyone who says competition policy, like break up the companies, will solve all of these problems, is wrong, right? Anyone who says online harms policy, which we'll talk about, fixes everything is wrong. You have to do all of them. And Europe has, right? They updated their privacy policy. They've built a big online harms agenda. They updated their competition regime. And they're also doing some AI policy too, right? So like you need comprehensive approaches, which is not an easy thing to do, right? It means doing three big things all at once.Nate Erskine-Smith17:39-17:41Especially in a minority parliament, in short periods of time, legislatively.Taylor Owen17:41-18:20Different countries have taken different pieces of it. Now, on the online harms piece, which is what the previous government took really seriously, and I think it's worth putting a point on that, right, that when we talked last was the beginning of this process. After we spoke, there was a national expert panel. There were 20 consultations. There were four citizens' assemblies. There was a national commission, right? Like a lot of work went into looking at what every other country had done, because this is a really wicked, difficult problem, and trying to learn from what Europe, Australia and the UK had all done. And we were kind of taking the benefit of being late, right? 
So they were all ahead of us.Taylor18:21-18:25People you work with on that grant committee. We're all quick and do our own consultations.Taylor Owen18:26-19:40Exactly. And like the model that was developed out of that, I think, was the best model of any of those countries. And it's now seen internationally, interestingly, as the new sort of milestone that everybody else is building on, right? And what it does is it says if you're going to launch a digital product, right, like a consumer-facing product in Canada, you need to assess risk. And you need to assess risk on these broad categories of harms that we have decided as legislators we care about, or you've decided as legislators you cared about, right? Child safety, child sexual abuse material, fomenting violence and extremist content, right? Like broad categories of things that we've said we think are harmful to our democracy. All you have to do as a company is a broad assessment of what could go wrong with your product. If you find something could go wrong, so let's say, for example, let's use a tangible example. Let's say you are a social media platform and you are launching a product that's going to be used by kids and it allows adults to contact kids without parental consent or without kids opting into being a friend. What could go wrong with that?Nate Erskine-Smith19:40-19:40Yeah.
In that case, that feature was launched by Instagram in Canada without any risk assessment, without any safety evaluation. And we know there was like a widespread problem of teenage girls being harassed by strange older men.Taylor20:28-20:29Incredibly creepy.Taylor Owen20:29-20:37A very easy, but not like a super illegal thing, not something that would be caught by the criminal code, but a harm we can all admit is a problem.Taylor20:37-20:41And this kind of mechanism would have just filtered out.Taylor Owen20:41-20:51Default settings, right? And thinking a bit before you launch a product in a country about what kind of broad risks might emerge when it's launched, and being held accountable for doing that.Nate Erskine-Smith20:52-21:05Yeah, I quite like the – I mean, maybe you've got a better read of this, but in the UK, California has pursued this. I was looking at this recently: Elizabeth Denham is now the Jersey Information Commissioner or something like that.Taylor Owen21:05-21:06I know, it's just, yeah.Nate Erskine-Smith21:07-21:57Random, I don't know. But she is a Canadian, for those who don't know Elizabeth Denham. And she was the information commissioner in the UK. And she oversaw the implementation of the first age-appropriate design code. That always struck me as an incredibly useful approach. In that even outside of social media platforms, even outside of AI, take a product like Roblox, where tons of kids use it. And just forcing companies to ensure that the default settings are prioritizing child safety, so that you don't put the onus on parents and kids to figure out each of these different games and platforms. In a previous world of consumer protection, offline, it would have been de facto. Of course we've prioritized consumer safety first and foremost. But in the online world, it's like an afterthought.Taylor Owen21:58-24:25Well, when you say consumer safety, it's worth like referring back to what we mean. 
Like a duty of care can seem like an obscure concept. But in law it's a real thing, right? Like you walk into a store. I walk into your office. I have an expectation that the bookshelves aren't going to fall off the wall and kill me, right? And you have to bolt them into the wall because of that, right? Like that is a duty of care that you have for me when I walk into your public space or private space. Like that's all we're talking about here. And the age-appropriate design code, yes, like sort of developed, implemented by a Canadian in the UK. And what it says, it also was embedded in the Online Harms Act, right? If we'd passed that last year, we would be implementing an age-appropriate design code as we speak, right? What that would say is any product that is likely to be used by a kid needs to do a set of additional things, not just these risk assessments, right? But we think like kids don't have the same rights as adults. We have different duties to protect kids than adults, right? So maybe they should do an extra set of things for their digital products. And it includes things like no behavioral targeting, no advertising, no data collection, no sexual adult content, right? Like kind of things that like – Seem obvious. And if you're now a child in the UK and you open – you go on a digital product, you are safer because you have an age-appropriate design code governing your experience online. Canadian kids don't have that because that bill didn't pass, right? So like there's consequences to this stuff. 
And I get really frustrated now when I see the conversation sort of pivoting to AI, for example, right? Like all we're supposed to care about is AI adoption and all the amazing things AI is going to do to transform our world, which are probably real, right? Like, not discounting its power. And we just move on from all of these, both problems and solutions, that have been developed for a set of challenges that still exist on social platforms. Like, they haven't gone away. People are still using these tools, and the harms still exist, and they are probably applicable to this next set of technologies as well. So this moving on from what we've learned and the work that's been done – to the people working in this space and the wide set of stakeholders in this country who care about this stuff and are working on it, it just, it feels like, you say deja vu at the beginning, and it is deja vu, but it's kind of worse, right? Cause it's like deja vu and then ignoring theTaylor24:25-24:29five years of work. Yeah, deja vu if we were doing it again. Right. We're not even, we're not evenTaylor Owen24:29-24:41Well, yeah. I mean, hopefully – I actually am not, I'm actually optimistic, I would say, that we will, because, for a few reasons – like, one, citizens want it, right? Like.Nate Erskine-Smith24:41-24:57Yeah, I was surprised on the – so you mentioned there that the rules that we design, the risk assessment framework really applied to social media, could equally be applied to deliver AI safety, and it could be applied to new technology in a useful way.Taylor Owen24:58-24:58Some elements of it. Exactly.Nate Erskine-Smith24:58-25:25I think AI safety is a broad bucket of things. So let's get to that a little bit because I want to pull the pieces together. So I had a constituent come in the office and he is really like super mad. He's super mad. Why is he mad? Does that happen very often? Do people get mad when they walk into this office? Not as often as you think, to be honest. 
Not as often as you think. And he's mad because he believes Mark Carney ripped him off.Taylor Owen25:25-25:25Okay.Nate Erskine-Smith25:25-26:36Okay. Yep. He believes Mark Carney ripped him off, not with a broken promise in politics, not because he said one thing and is delivering something else, nothing to do with politics. He saw a video online, Mark Carney told him to invest money. He invested money and he's out the 200 bucks or whatever it was. And I was like, how could you possibly have lost money in this way? This is like, this was obviously a scam. Like what, how could you have been deceived? But then I go and I watched the video. And it is, okay, I'm not gonna send the 200 bucks, and I've grown up with the internet, but I can see how– Absolutely. In the same way, phone scams and Nigerian princes and all of that have their own success rate. I mean, this was a very believable video that was obviously AI generated. So we are going to see rampant fraud. If we aren't already, we are going to see many challenges with respect to AI safety. Over and above the risk assessment piece, what do we do to address these challenges?Taylor Owen26:37-27:04So that is a huge problem, right? Like the AI fraud, AI video fraud, is a huge challenge. In the election, when we were monitoring the last election, by far the biggest problem or vulnerability of the election was an AI-generated video campaign that every day would take videos of Poilievre's and Carney's speeches from the day before and generate – like, morph them – into conversations about investment strategies.Taylor27:05-27:07And it was driving people to a crypto scam.Taylor Owen27:08-27:11But it was torquing the political discourse.Taylor27:11-27:11That's what it must have been.Taylor Owen27:12-27:33I mean, there's other cases of this, but that's probably – and it was running rampant on particularly Meta platforms. They were flagged. They did nothing about it. 
There were thousands of these videos circulating throughout the entire election, right? And it's not like the end of the world, right? Like nobody – but it torqued our political debate. It ripped off some people. And these kinds of scams are –Taylor27:33-27:38It's clearly illegal. It's clearly illegal. It probably breaks election law too, misrepresenting a political figure, right?Taylor Owen27:38-27:54So I think there's probably an Elections Canada response to this that's needed. And it's fraud. And it's fraud, absolutely. So what do you do about that, right? And the head of the Canadian Banking Association said there's like billions of dollars in AI-based fraud in the Canadian economy right now. Right? So it's a big problem.Taylor27:54-27:55Yeah.Taylor Owen27:55-28:46I actually think there's like a very tangible policy solution. You put these consumer-facing AI products into the Online Harms Act framework, right? And then you add fraud and AI scams as a category of harm. And all of a sudden, if you're Meta and you are operating in Canada during an election, you'd have to do a risk assessment on, like, the AI fraud potential of your product. Responsibility for your platform. And then when it starts to circulate, we would see it. They'd be called out on it. They'd have to take it down. And like that's that, right? Like so that we have mechanisms for dealing with this. But it does mean evolving what we worked on over the past five years, these online harms risk assessment models, and bringing some of the consumer-facing AI, both products and related harms, into the framework.Nate Erskine-Smith28:47-30:18To put it a different way, I mean, so this is years ago now that we had this, you know, grand committee in the UK holding Facebook and others accountable. This was really in the wake of the Cambridge Analytica scandal. 
And the platforms at the time were really holding firm to this idea of Section 230 and avoiding host liability, saying, oh, we couldn't possibly be responsible for everything on our platform. And there was one problem with that argument, which is that they completely acknowledged the need to take action when it came to child pornography. So they said, yeah, no liability for us, but of course there can be liability on this one specific piece of content, and we'll take action on this one specific piece of content. And it always struck me from there on out that there's no real intellectual consistency here. It's more just a question of what should be in the category of things they take responsibility for. Obviously harmful content like that is an obvious first step, obvious for everyone. But there are other categories. Fraud is another one. When they're making so much money, when they are investing so much money in AI, when they've ignored privacy protections and everything else throughout the years, we can't leave it up to them. Setting a clear set of rules that says this is what you're responsible for, and expanding that responsibility, seems to make a good amount of sense.Taylor Owen30:18-30:28It does, although I think those responsibilities need to be different for different kinds of harms, because there are different speech implications and democratic implications of sort of absolute solutions to different kinds of content.Taylor30:28-30:30So child pornography is a great example.Taylor Owen30:30-31:44In the Online Harms Act, for almost every type of content, it was that risk assessment model. But there was a carve-out for child sexual abuse material, including child pornography, and for intimate images and videos shared without consent. It said the platforms actually have a different obligation, and that's to take it down within 24 hours.
And the reason you can do it with those two kinds of content is because, one, AI is actually pretty good at spotting it. It might surprise you, but there are a lot of naked images on the internet that we can train AI with. So we're actually pretty good at using AI to pull this stuff down. But the bigger reason is that, I think, as a society, it's okay to be wrong in the gray area of that speech. If something is debatable, whether it's child pornography, I'm actually okay with us suppressing the speech of the person who sits in that gray area. Whereas for something like hate speech, it's a really different story. We do not want to suppress and over-index for that gray area on hate speech, because that's going to capture a lot of reasonable debate that we probably want.Nate Erskine-Smith31:44-31:55Yeah, I think soliciting investment via fraud probably falls more in line with the child pornography category, where it's, you know, very obviously illegal.Taylor Owen31:55-32:02And that mechanism is a takedown mechanism, right? If we see fraud, if we know it's fraud, then you take it down. Some of these other things we have to go at differently.Nate Erskine-Smith32:02-32:24I mean, my last question really is to pull the threads together. You've got these different pieces that were introduced in the past. And you've got a government with lots of similar folks around the table, but a new government and a new prime minister, certainly with a vision for getting the most out of AI when it comes to our economy.Taylor32:24-32:25Absolutely.Nate Erskine-Smith32:25-33:04You have, for the first time in this country, an AI minister, a junior minister to industry, but still a specific titled portfolio with his own deputy minister, and he really wants to be seized with this.
And in a way, from every conversation I've had with him, he wants to maximize productivity in this country using AI, but he's also cognizant of the risks and wants to address AI safety. So where from here? You've talked in the past about sort of a grander tech accountability and sovereignty act. Do we go piecemeal, you know, a privacy bill here, an AI safety bill, an online harms bill, and we have disparate pieces? What's the answer here?Taylor Owen33:05-34:14I mean, I don't have the exact answer. But I think there are some lessons from the past that this government could take. One is that piecemeal bills that aren't centrally coordinated or have no connectivity between them end up with piecemeal solutions that are imperfect and would benefit from some cohesiveness, right? So when the previous government released AIDA, the AI and Data Act, it was really in tension in some real ways with the online harms approach. Two different departments issuing two similar bills on two separate technologies, not really talking to each other as far as I could tell from the outside. So we need a coordinated, comprehensive approach to digital governance. That's point one, and we've never had it in this country. And when I saw the announcement of an AI minister, my mind went first to the idea that he, or that office, could play that role. Because AI is cross-cutting, right? Every department in our federal government touches AI in one way or another. And the governance of AI, and the adoption of AI by society on the other side, is going to affect every department and every bill we need.Nate Erskine-Smith34:14-34:35So if Evan pulled in the privacy pieces, that would help us catch up to GDPR. Which it sounds like they will, right? Some version of C-27 will probably come back.
If he pulls in the online harms pieces that aren't related to the Criminal Code and drops those provisions, says, you know, Sean Fraser, you can deal with this if you like, but these are the pieces I'm holding on to.Taylor Owen34:35-34:37With a frame of consumer safety, right?Nate Erskine-Smith34:37-34:37Exactly.Taylor Owen34:38-34:39If he wants...Nate Erskine-Smith34:39-34:54Which is connected to privacy as well, right? These are all... So then you have thematically a bill that makes sense. And then you can pull in the AI safety piece as well. And then it becomes a consumer protection bill for living our lives online. Yeah.Taylor Owen34:54-36:06And I think there's an argument about whether that should be one bill or multiple ones. I actually think there are cases for both. There's concern about big omnibus bills that do too many things, and too many committees reviewing them, and whatever. That's sort of a machinery of government question. But the principle is that these should be tied together in a narrative that the government is explicit about making and communicating to the public. We know that 85 percent of Canadians want AI to be regulated. What do they mean? What they mean is that at the same time as they're being told by our government and by companies that they should be using and embracing this powerful technology in their lives, they're also seeing some risks. They're seeing risks to their kids. They're being told their jobs might disappear. Why should I use this thing when I'm seeing some harms, I don't see you guys doing anything about those harms, and I'm seeing some potential real downside for me personally and my family? So even in the adoption frame, I think thinking about data privacy, safety, consumer safety, to me, that's the real frame here. It's citizen safety, consumer safety using these products.
Yeah, politically, that is what it is. It makes sense to me.Nate Erskine-Smith36:06-36:25Right, I agree. And really lean into child safety at the same time. Because I've got a nine-year-old and a five-year-old. They are growing up with the internet. And I do not want to have to police every single platform that they use. I do not want to have to log in and go, these are the default settings on the parental controls.Taylor36:25-36:28I want to turn to government and go, do your damn job.Taylor Owen36:28-36:48Or just make them slightly safer. I know these are going to be imperfect. I have a 12-year-old. He spends a lot of time on YouTube. I know that's always going to be a place with some content I would prefer he doesn't see. But I would just like some basic safety standards on that thing, so he's not seeing the worst of the worst.Nate Erskine-Smith36:48-36:58And we should expect that. Certainly that YouTube, with its promotion engine, its recommendation function, is not actively promoting terrible content to your 12-year-old.Taylor Owen36:59-37:31Yeah. That's de minimis. Can we just torque this a little bit, right? So maybe he's not seeing horrible content about Charlie Kirk when he's a 12-year-old on YouTube. Can we just do something? I think that's a reasonable expectation as a citizen. But it requires governance. And it's worth putting a real emphasis on this: one thing we've learned in this moment of repeated deja vu, going back 20 years really, from our experience with social media through to now, is that these companies don't self-govern.Taylor37:31-37:31Right.Taylor Owen37:32-37:39We know that indisputably. So to think that AI is going to be different is delusional. No, they'll pursue profit, not the public interest.Taylor37:39-37:44Of course. Because that's what they are. These are the largest companies in the world. Yeah, exactly.
And AI companies are even bigger than the last generation, right?Taylor Owen37:44-38:00We're creating something new with the scale of these companies. And to think that their commercial incentives and their broader long-term goals around AI are not going to override these safety concerns is just naive to the nth degree.Nate Erskine-Smith38:00-38:38But I think you make the right point, and it's useful to close on this: these goals of realizing the productivity potential of AI alongside AI safety are not mutually exclusive or oppositional goals. If you create a sandbox to play in, companies will be more successful. If you have certainty in regulations, companies will be more successful. And if people feel safe using these tools, and certainly if I feel safe with my kids learning these tools, growing up with them in their classrooms and everything else, adoption rates will soar. Absolutely. And then we'll all benefit.Taylor Owen38:38-38:43They work in tandem, right? I think you can't have one without the other, fundamentally.Nate Erskine-Smith38:45-38:49Well, I hope I don't invite you back five years from now to have the same conversation.Taylor Owen38:49-38:58Well, I hope you do invite me back in five years, but I hope it's to think back on all the legislative successes of the previous five years. That'll be the moment.Taylor38:58-38:59Sounds good. Thanks. This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.uncommons.ca
A looming deadline always gets attention, and for DoD suppliers, the clock is ticking. On October 1, 2025, the Department of Defense will begin including Cybersecurity Maturity Model (CMMC) certification requirements in new contracts. This week on Feds At The Edge, four leading experts cut through the complexity and share practical guidance to help you start, or finish, your CMMC journey. Sean Frazier, Federal Chief Security Officer for Okta, explains why “Know Thy Data” is the key to applying the right level of security where it matters most. Alan Dinerman, PhD, Senior Manager, Cyber Strategy, Policy, and Privacy at Mitre, puts CMMC in context with other cybersecurity standards, noting its focus on Controlled Unclassified Information. And Jeff Adorno, Field Chief Compliance Officer at ZScaler, warns of risks in the AI era, where sensitive data can unintentionally “leak” into Large Language Models. The panel as a whole highlights how aligning with existing frameworks and using current technologies can demonstrate progress to auditors and ease compliance. Listen now on your favorite podcast platform because whether you're deep into compliance or just getting started, this conversation will help you navigate the evolving landscape of CMMC and beyond.
Northern Illinois athletic director Sean Frazier talks historic win over Notre Dame full 752 Tue, 10 Sep 2024 20:33:23 +0000 0agQmSXux03eUJ4IClJORHNbI929e1c3 northern illinois huskies,sports Bernstein & Holmes Show northern illinois huskies,sports Northern Illinois athletic director Sean Frazier talks historic win over Notre Dame Dan Bernstein and Laurence Holmes bring you fun, smart and compelling Chicago sports talk with great listener interaction. On Wednesdays, Leila Rahimi joins as a guest co-host. The show features discussion of the Bears, Blackhawks, Bulls, Cubs and White Sox as well as the biggest sports news making waves beyond Chicago. Recurring guests include Bears safety Jaquan Brisker, Pro Football Talk founder Mike Florio, Cubs outfielder Ian Happ and Cubs president of baseball operations Jed Hoyer.Catch the Bernstein & Holmes Show live Monday through Friday (10 a.m.- 2 p.m. CT) on 670 The Score, the exclusive audio home of the Cubs and the Bulls, or on the Audacy app. For more, follow the show on X @BernsyHolmes. 2024 © 2021 Audacy, Inc.
In the final hour, Dan Bernstein and Laurence Holmes were joined by Jenkins Elite founder Tim Jenkins, who provided a deep dive into Bears rookie quarterback Caleb Williams' performance in his NFL debut Sunday. Northern Illinois athletic director Sean Frazier then joined the show to discuss the school's big 16-14 win over No. 5 Notre Dame on Saturday. Later, Bernstein and Holmes listened and reacted to Bears tight end Cole Kmet tell veteran tight end Marcedes Lewis that he has been holding onto his rookie card for the past 19 years.
On Episode 56 of Winners Win, Leaders Lead with Van Malone is Part 2 of Best of Winners Win, Leaders lead featuring some of the best leaders Van has spoken with including Jon Gordon, David Shaw, Todd Bowles, Sean Frazier, Cori Close, Jen Cohen, DeLaina Jordan, Deidre Merritt, Sean Frazier & Erin Mykleby. __ Winners Win, Leaders Lead, was started by Coach Van Malone. He talks to leaders in coaching, athletic departments, business leaders and CEOs. In his over 20 years of coaching, he's been around some of the greatest leaders in his field and had the opportunity to coach thousands of young men and women himself. And now he wants to share that leadership knowledge with you!
@1QLeadership Question: Would an 'NIT type' event make sense considering the new College Football Playoff structure? Sean Frazier, VP & Director of Athletics at NIU, stops by 1Q to discuss a few items on his mind as college athletics continues to evolve. Clandestine Transfer Portal activities Value of bowl games 'NIT type' event tied to the College Football Playoff Antitrust thoughts Positive outlook in college athletics Frazier has led NIU athletics for over a decade and continues to provide insight, mentorship, and leadership in the industry. - One Question Leadership Podcast - Tai M. Brown
Sean Frazier is the Federal CSO at Okta. In this episode, he joins host Charlie Osborne to discuss the reality of identity, including how organizations approach cybersecurity, takeaways from Okta's 2022 State of Zero Trust Security Survey, and more. A trusted partner to businesses around the world, Okta powers identity for the internet by creating great user experiences, increasing customer engagement, improving employee productivity, and getting apps to market faster. To learn more about our sponsor, visit https://okta.com.
There's a new book captivating audiences across the federal technology space and getting rave reviews. "Recoding America – Why Government Is Failing in the Digital Age and How We Can Do Better". Former U.S. Deputy CTO and founder of Code for America Jen Pahlka wrote the book based on her intimate firsthand knowledge of working in and with government agencies for most of the last two decades to improve digital service delivery. And in it, she concludes that there are much deeper issues that are leading to the failure of the U.S. government in the digital age …Or as the book's synopsis puts it: “we must stop trying to move the government we have today onto new technology and instead consider what it would mean to truly recode American government.” In web application security, compromised credentials are responsible for 80% of breaches. This has highlighted the need to replace passwords with more effective security methods. Passwordless authentication methods offer a faster, more secure, hassle free way to protect our digital lives. In a new interview, Sean Frazier, federal chief security officer at Okta, discusses the advances and their use cases. The Daily Scoop Podcast is available every Tuesday and Thursday afternoon. If you want to hear more of the latest from Washington, subscribe to The Daily Scoop Podcast on Apple Podcasts, Google Podcasts, Spotify and Stitcher.
On the question of NCAA activism, Northern Illinois VP of Intercollegiate Athletics, Sean Frazier returns to @1QLeadership to ponder what the collective group of institutions that make up the governing body of college athletics is doing to counteract the trend of states banning DEI positions, banning books, and other actions that seem to violate the social agreement. Frazier mentions past activism actions in the Carolinas among other places as a standard that should continue. - One question Leadership Podcast - Tai M. Brown
Sean Frazier, VP & Director of Athletics at NIU, visits @1QLeadership to discuss his idea for mitigating the potential educational impact of entering the transfer portal. Frazier proposes a solution that finances the remaining degree requirements for those who enter but don't sign on with a new school. He also gives his insight on why ADs went from silence to actively promoting NIL collectives on their respective campuses. Also joining the conversation is Lauren Bullock of Stretch.
0:00 - Rod Blagojevich fills in for Dan Proft 12:57 - Is Pres Trump becoming unhinged? His Truth Social postings demanding a NEW election NOW 29:02 - An about face for Pres Biden and the Democratic “talking points.” NOW they want to FUND the Police 47:58 - Dan Brady, Illinois State Representative for the 105th District shares the improvements he will make if elected Secretary of State. For more on Dan's run for Secretary of State visit votedanbrady.com 01:01:21 - Kevin R. Brock, former assistant director of intelligence for the FBI and former principal deputy director of the National Counterterrorism Center: Unsealed Mar-a-Lago search warrant affidavit reveals the government has no case against Trump 01:13:55 - Noted economist Stephen Moore has noticed a lot of anger over student loan forgiveness. Steve has a new book out too Govzilla: How the Relentless Growth of Government Is Devouring Our Economy—And Our Freedom 01:29:09 - Sean Frazier, Associate Vice-President and Director of Athletics at Northern Illinois University, previews the 2022 Husky Football Season which you can listen to live on WIND AM 560 The Answer. season opener is Thursday, September 1st at home against Eastern Illinois University, WIND will have pre-game starting at 6:30pm and the kickoff at 7:00pm 01:40:32 - ASK ROD!!!See omnystudio.com/listener for privacy information.
Mike Gaspar, formerly of Moonspell, stops by the show to talk all about his new band Seventh Storm. Mike has a lot to say, so make sure you check this one out. Also, we spend sometime with the author of a killer book titled Rock n Roll Children - An 80s Hair Metal Garage Band Story, Sean Frazier. Sean's story sounds so familiar to anyone who dreamed of making it as rock star back in the day. Plus, we have music by the following bands:The Retaliators, Michael Schenker Group, King Bull, Owls & Aliens, Seventh Storm, Glazer Girl, Lovesick Radio, The KillerHertz & Last in Line Join Randy and Troy, for this and every episode of Ouch, You're on my Hair and subscribe to the show on ApplePodcasts, Spotify, iHeartRadio, Amazon Music, Podomatic, Podbean, and more. You can find them on Instagram, Twitter and Facebook as well.
Join us as we welcome Sean Frazier to Behind the Broom. Sean has spent the last 30 years of his career with Environment Control, from working as a part-time cleaner in college to successfully operating his own franchise for the last 14 years. Click the play button to hear a true testimony of what "It's About Lives" really means.
Federal Tech Podcast: Listen and learn how successful companies get federal contracts
The Cybersecurity and Infrastructure Security Agency recently highlighted the five pillars of a maturity model when it comes to Zero Trust: Identity, Device, Network, Workload, and Data. There is no accident that the first pilar is identity. Sean Frazier is a well-known expert in identity management. During this interview, he provides a perspective about this topic that ranges from compliance to assistance in proposal writing. One can argue that this is an isolated emphasis until you realize that the OMB Memo 22-09 talks about centralized identification, multifactor identification, and device signaling, It would seem reasonable to conclude that an effective identity management system is a key component in making sure today's dynamic federal hybrid cloud is safe.
As organizations begin to transition out of pandemic response, many will use the zero-trust architecture adopted during remote work to continue bolstering their security posture. According to one chief security officer, this means prioritizing endpoint protection. “If you think about the core pillars and constructs of zero trust, one of the more important ones is the endpoint,” says Sean Frazier, federal CSO at Okta. “Almost everything these days with regard to endpoint is mobility.” Guest: Sean Frazier, Federal Chief Security Officer, Okta Host: Francis Rose, Host, The Daily Scoop Podcast This podcast was produced by Scoop News Group for The Daily Scoop Podcast and underwritten by Okta.
Joining this week's episode of "From the Chair" is NIU AD Sean Frazier. Sean and I dive in on his work nationally around DEI over the last two years and his insights on growth in this area - both in the student athlete arena and well as in administrative leadership roles. We also talk about his personal growth as a leader during the pandemic and with a major change in his football program. We discuss his candid and insightful thoughts on the portal, NIL and the D1 Transformation Committee and then end with a personal anecdote about our shared connection with having a child on the spectrum. This a good one...listen in. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
In this week's episode, we sat down with NIU coaches Thomas Hammock, Connie Teaberry and Athletic Director Sean Frazier to discuss Black History Month, race in college athletics and furthering the conversation within the community.
Andy Katz talks with the 2022 NCAA/MOAA Award for Diversity and Inclusion winners – Northern Illinois University representatives Sean Frazier, vice president and director of athletics and recreation and Courtney Vinson, senior associate athletics director.
The latest on mid-major realignment with Belmont reportedly departing the OVC, contract details for Northern Illinois' Sean Frazier and more. Be sure to check your inbox to see more of today's news and notes from around the nation. We would love to know what you think of the show and you can let us know on social media @D1ticker. If you are not subscribed to D1.ticker, you can and should subscribe at www.d1ticker.com/.
This week, host Bob Wallace is joined by Sean Frazier, Athletic Director at Northern Illinois University. Sean reflects on the current state of diversity in sports, both on the administration and coaching side.
Sean Frazier, federal CSO at Okta, discusses identity management, lifecycle management, and some of the challenges of identification for Robotic Process Automation.
Telework is here to stay for the foreseeable future and state and local agencies continue to build on their efforts to support the remote workforce. One of the security questions that should be top of mind for IT leaders is whether their employees are accessing critical applications from trusted devices. In this podcast, security experts discuss how the pandemic is testing the boundaries of IT security, including authentication. Sponsored by Duo Security. Guests: Sean Frazier, Advisory CISO, Federal and Bart Green, VP, State, Local and Education at Cisco’s Duo Security Host: Wyatt Kash, SVP, Content Strategy, Scoop News Group Look for more from the "Speed to Security" series on www.statescoop.com/listen
Duo's CISO Advisory team members are legends in their fields. They have seen it all, and they are ready to share their insights with you!If two crows make an attempted murder, then a group of Advisory CISOs surely make up a Murder Board. A Murder Board is a group of people pulled together to provide critical review. The idea is to prepare someone for a difficult situation such as a presentation, or for our meaning, your career in security.This month, the CISO Murder Board, comprising our Duo Advisory CISOs Dave Lewis, Richard Archdeacon, Sean Frazier and Wolf Goerlich) take on the topic of remote work and look to the future. You’ll hear:A lively around-the-horn of hot topics and current eventsA conversation on the state of remote workHow remote work requires a culture changeThe future of work and where we go from hereHow zero trust can lead to remote work cost savingsTune in now.Learn more at Duo.comTry Duo For FreeWith our free 30-day trial you can see how easy it is to get started with Duo and secure your workforce, from anywhere and on any device.
In the second hour, Mike Mulligan and David Haugh recapped a busy weekend in Chicago sports, discussing the White Sox's series loss to Cleveland and the Blackhawks' upset of the Oilers in their qualifying-round series. Score baseball insider Bruce Levine then joined the show to discuss the latest Cubs and White Sox storylines. Later, Northern Illinois athletic director Sean Frazier joined the program to discuss why the MAC chose to cancel its football season amid the pandemic this fall.
Original air date: July 14, 2020 Sports have frequently been recognized as a vehicle for bringing people together despite individual differences. As our country continues to navigate the social injustices in society, intercollegiate athletics plays an important part in the journey to educate and influence positive change through the younger generation student-athletes. While diversity and inclusion plans may have previously existed in athletics departments, putting those plans into action is more important than ever. In this session, panelists Mike Aresco, Commissioner, American Athletic Conference; Sean Frazier, Associate Vice President & Director of Athletics, Northern Illinois University; and Renae Myles Payne, Senior Associate AD for Administration, University of Miami, will discuss a few case studies within the industry highlighting D&I plans that departments have executed as intercollegiate athletics seeks to be a catalyst for change.
In the final hour, David Haugh and Zach Zaidman were joined by Northern Illinois athletic director Sean Frazier to discuss the problems that college sports are facing amid the coronavirus pandemic. Rich King then joined the show to discuss working White Sox webcasts. Later, 670 afternoon host Dan McNeil joined the program to remember the life and legacy of Seth Mason, who passed away at 71 on Saturday. Mason was one of the founders of Chicago radio stations WXRT and The Score, and McNeil explained how Mason was an instrumental part of developing flagship radio in Chicago.
With all the uncertainty in the world these days, Northern Illinois University athletic director Sean Frazier talks about the past, present and future as he attempts to maneuver the college sports world during a pandemic. See omnystudio.com/policies/listener for privacy information.
With all the uncertainty in the world these days, Northern Illinois University athletic director Sean Frazier talks about the past, present and future as he attempts to maneuver the college sports world during a pandemic. See omnystudio.com/policies/listener for privacy information.
I sit down with Brandon Hassell (@bhass23) and Sean Frazier (@mr_theoo) discussing current events regarding racism, the random hangings, and interracial dating. Also, to see if my wife listens, I pronounce the word subtle ‘sub-tle’ lets see if she catches on. --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app · Anchor: The easiest way to make a podcast. https://anchor.fm/app
Joe Ostrowski was joined by Northern Illinois athletic director Sean Frazier to discuss how college athletics are dealing with the coronavirus pandemic and civil unrest and how that will shape them moving forward. Frazier also discussed some of the issues that current and future athletes face on and off the field.
State and local agencies can make it easier and safer for their employees to work remotely, with identity and access solutions that validate the security of devices connecting to the network. Increasing the degree of visibility on how remote devices are configured will go a long way to reduce network risks, especially for those who may need to use their own devices, say security experts. Sponsored by Duo Security. Guests: Bart Green, Vice President, State, Local and Education markets, Duo Security and Sean Frazier, Advisory CISO, Federal, Duo Security Look for more coverage of the “Speed to Security” series on www.statescoop.com/listen
It’s a particularly difficult time for state and local agencies, some of which have moved to almost 100% remote work over the course of days. The number of users accessing government networks and applications from outside the organization’s perimeter has created an urgent security problem. As CIOs respond to this new surge in remote work, they have an opportunity to deploy future-ready security solutions that can be quickly implemented and managed from the cloud. Sponsored by Duo Security. Guests from Duo Security: Bart Green, VP, SLED and Sean Frazier, Advisory CISO, Federal Look for more coverage of “Speed to Security” on www.statescoop.com/listen
As more government employees work remotely, agency IT teams can rapidly secure access at scale with cloud-based identity authentication, says a cybersecurity veteran. Sponsored by Duo Security, now a part of Cisco. Guest: Sean Frazier, Advisory CISO, Duo Security Look for more coverage of “IT Modernization in Government” on www.fedscoop.com/listen
In his last hour, Laurence Holmes was joined by Northern Illinois athletic director Sean Frazier to discuss his role during this unusual and troubling time amid the coronavirus pandemic. Later, Siafa Lewis of NBC 5 joined the show to partake in the Whatcha Watching segment.
Duo's CISO Advisory team members are legends in their fields. They have seen it all, and they are ready to share their insights with you!If two crows are an attempted murder, then a group of Advisory CISOs surely make up a Murder Board. A Murder Board is a group of people that are pulled together to provide critical review. The idea is to prepare someone for a difficult situation such as a presentation, or for our meaning, your career in security.In this month’s episode of the CISO Murder Board, our panel of intrepid Duo Advisory CISOs (Dave Lewis, Richard Archdeacon, Sean Frazier and Wolf Goerlich) tackles timely topics including:Perspectives on patching when teams are working remotelySunshine and puppies (aka exercising while working from home)How physical locations are being securedCyber threats taking advantage of people being away from their officesDad jokes (of course) to lighten the mood a bitTune in now.Learn more at Duo.comCheck out our free security toolkit!
Sean Frazier, Associate VP & Director of Athletics at Northern Illinois, visits @1QLeadership to give insight on the questions ADs may be asking themselves during the coronavirus pandemic. One of the key items covered is how to operationalize and provide resources for some of the ideas being discussed regarding the future. He also heavily emphasizes his concern for the health and well-being of the student-athletes who have had their routines abruptly halted.
Sean Frazier is an emerging voice in the world of college athletics. Well respected for his student-centered approach in a world of bigtime college athletics, Frazier is the voice that we need to hear as the world of sports has been shaken by the COVID-19 pandemic.
Welcome to our inaugural podcast! Our Duo CISO Advisory team members are legends in their fields. They have seen it all, and they are ready to share their insights with you! If two crows are an attempted murder, then a group of Advisory CISOs surely makes up a Murder Board. A Murder Board is a group of people pulled together to provide critical review. The idea is to prepare someone for a difficult situation such as a presentation, or, in our case, your career in security. This podcast will start out as a monthly endeavor. The hosts are Dave Lewis, Richard Archdeacon, Sean Frazier and Wolf Goerlich. The Duo CISO Advisors will discuss security issues that pertain to the business of running a security practice, living life as a CISO, and current events of the day, packed with humor, knowledge and grace. Learn more at Duo.com. Check out our free security toolkit!
UN rapporteurs say that the Saudi Crown Prince was probably involved in the installation of spyware on Amazon founder Jeff Bezos’s personal phone. Brazilian prosecutors have indicted Glenn Greenwald, co-founder of the Intercept, on hacking charges. IBM describes a renewed NetWire campaign, and Microsoft says StarsLord is back, too. And in cyberspace, there’s nothing new on the US-Iranian front. Ben Yelin from UMD CHHS on surveillance cameras hidden in gravestones. Guest is Sean Frazier from Cisco Duo on their most recent State of the Auth report. For links to all of today's stories check out our CyberWire daily news brief: https://thecyberwire.com/issues/issues2020/January/CyberWire_2020_01_22.html Support our show
How a tech partnership enhances federal government agency efforts to use unified identity and access controls to make multi-cloud adoption more secure. Sponsored by Cisco Systems. FedScoop interview with Jamie Sanbower, principal architect at Cisco Systems, and Sean Frazier, advisory chief information security officer for federal government at Duo Security. Moderated by Wyatt Kash for FedScoop. Look for more coverage of “IT Modernization in Government” on www.fedscoop.com/listen
Success Beyond Winning is the topic of conversation between Sean Frazier, AD at NIU, John Hartwell, AD at Utah State, and Learfield/IMG Executive VP and former AD, Mike Hamilton. The three touch on defining success beyond the easy evaluation points of wins and losses. Frazier and Hartwell talk about internal versus external messaging when discussing athletic expectations, insight on fans and realistic expectations, and the group wraps the discussion on an aspirational note.
Laurence Holmes was joined by one of his favorite guests, Northern Illinois athletic director Sean Frazier. The two chatted about the Huskies' upcoming football season, new head coach Thomas Hammock and why Frazier is so energetic. See omnystudio.com/policies/listener for privacy information. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Listen in as Beyonce's two-decade-long stylist Ty Hunter shares wisdom about life and business. He chats about everything from getting in the room with celebrities to how he handles depression. To book Ty, please contact his manager Sean Frazier via the email below: sean@smfglobal.com. If you enjoyed this episode, please take a screenshot and tag us on IG or Twitter: Ty's Instagram (@tytryone); Gaynete's Instagram (gaynete)
Government networks remain vulnerable to breaches from phishing attacks. But security experts outline solutions to tighten identity authentication controls that enable agencies to move toward achieving a zero-trust security model. Sponsored by Duo Security. Guests: Dean Scontras, vice president of public sector, and Sean Frazier, advisory chief information security officer with Duo Security, a division of Cisco.
David Ordan is the Development Director for the Eisenhower Center in Milwaukee. He joined Dan and Amy to talk about "Ike's Bites Dog Treats" and the work they do to provide people with disabilities with opportunities to learn collaboration, marketing, quality control, business management, and money management. Plus, Sean Frazier, the Associate Vice-President and Director of Athletics at Northern Illinois University, joined Dan and Amy with a preview of football season.
Should we be expecting another great season from the Huskies and a potential bowl game on the horizon? How are some of the local kids shaping up and how much do they recruit from Chicago? Their competitive schedule begins this weekend against the Iowa Hawkeyes. Associate Vice-President and Director of Athletics at Northern Illinois University, Sean Frazier joins Dan and Amy to discuss.
Sean Frazier, Associate VP & AD at NIU, visits @TaiMBrown and the 1.Question Podcast at the 2018 NACDA Convention. Frazier discusses the tumultuous environment in society and its effect on leadership in athletics. He mentions being a constant educator on what's happening, what the potential outcomes are, and what can be done now to prepare for future issues. Emphasizing that there is no substitute for daily preparation, Frazier also talks about athletics continuing to be on the forefront of social change.
What is the Zero Trust Security model and how will it help federal IT professionals protect the data at their agencies? Find out when Sean Frazier, Advisory CISO – Federal at Duo Security, joins host John Gilroy on this week’s Federal Tech Talk.
Sean Frazier, Director of Athletics at Northern Illinois University, discusses how his experience with college ice hockey as an NCAA D2 & D3 AD helped propel him to his current position. He roots his leadership philosophy in getting the right people in the right positions, developing them, and empowering them to do the job on a daily basis. Speaking about his mentors, Frazier is effusive in his appreciation for his development under Barry Alvarez and others he has learned from over his career.
Sean Frazier, Athletic Director at Northern Illinois University, sits down with 1.Q contributing host Tai Brown at the 2017 NACDA Convention. Frazier discusses his early beginnings as a football coach and how/why he made the transition to administration. Frazier also touches on his time working for Wisconsin AD Barry Alvarez.
The latest episode has a fun interview with Australian shooting guard Courtney Woods about life down under, and the guys rave about how fun it is to watch the NIU women's basketball team compared to watching the men's team and debate NIU athletic director Sean Frazier's comments regarding the Group of 5 having a football playoff.
Northern Illinois AD Sean Frazier goes in on the need for a serious look at a 'Group of 5' National Championship. Frazier's position stems from financial challenges for G5 institutions & the potential for TV & corporate sponsorship revenue to buoy bottom lines. Additional points of discussion center around brand exposure, fan base engagement, the reality of a 'Power 6' push, if 'Power 5' institutions would be in favor of such a move, timing from a TV programming standpoint, scheduling considerations & more.
Northern Illinois AD Sean Frazier talks about the possibility of the NCAA applying the Rooney Rule (or something similar) when hiring.
Associate vice president and director of athletics Sean Frazier discusses the successful Victor E Ball event and recent accomplishments of Huskie student-athletes both on the fields of competition and in the classroom.