Podcasts about Communications Decency Act

Attempt by the United States Congress to regulate pornographic material on the Internet

  • 369 podcasts
  • 564 episodes
  • 42m avg duration
  • 1 episode every other week
  • Latest: Jun 12, 2025

Popularity (chart, 2017–2024)


Best podcasts about Communications Decency Act

Latest podcast episodes about Communications Decency Act

Indisputable with Dr. Rashad Richey
Social Media Platforms At Risk Amid Potential Section 230 Rollback

Indisputable with Dr. Rashad Richey

Jun 12, 2025 · 11:27


Sam Raus joins the Bullpen to discuss efforts to roll back clauses of the Communications Decency Act of 1996, a rollback that would likely crush smaller social media platforms. Host: Dr. Rashad Richey (@IndisputableTYT). Bullpen guest: Sam Raus. Subscribe on YouTube: https://www.youtube.com/IndisputableTYT. Follow on Facebook: https://www.facebook.com/IndisputableTYT, Twitter: https://www.twitter.com/IndisputableTYT, Instagram: https://www.instagram.com/IndisputableTYT. Learn more about your ad choices at megaphone.fm/adchoices.

Relationship Insights with Carrie Abbott
Section 230 Needs to Be Repealed!

Relationship Insights with Carrie Abbott

Apr 22, 2025 · 28:01


This year, instead of highlighting 12 companies that facilitate sexual exploitation, the National Center on Sexual Exploitation is highlighting 12 survivors who were denied justice in the courts because of Section 230 of the Communications Decency Act. Legal Counsel for NCOSE Tori Hirsch joins us with the important facts. Dirty Dozen List 2025 - NCOSE (https://endsexualexploitation.org/dirty-dozen-list-2025/)

Law, disrupted
Tech Law Insights: Ben Lee's Extraordinary In-House Career

Law, disrupted

Apr 10, 2025 · 49:40


John is joined by Ben Lee, Chief Legal Officer of Reddit.  They discuss Ben's extensive career as a senior in-house lawyer in several of the most successful tech companies in the world.  After earning degrees in physics and economics, Ben worked at IBM's research lab, where he was intrigued by the way lawyers grappled with the impacts of technology on society.  Ben then went to law school and began his career as a litigator at a New York law firm but left to work at the Legal Aid Society.  Financial realities eventually led him back to private practice and then to a career in-house.  At AT&T and NEC, Ben worked closely with pioneering computer scientists and handled complex IP matters involving emerging technologies like machine learning and AI.  When he moved to Google, Ben advised on major projects like Chrome, Android, and Google Cloud at very early stages when their success was far from assured.  Ben later joined Twitter during its early, fast-paced growth phase, managing litigation, IP, employment, and regulatory issues.  He led Twitter's lawsuit against the U.S. government over transparency for national security requests.  Later, at Airbnb, Ben tackled challenging regulatory landscapes worldwide, and at Plaid, he advocated for consumers' rights to financial data.  At Reddit, Ben now oversees all legal functions for a vast online platform with over 100,000 user-created and moderated communities.  Section 230 of the Communications Decency Act is vital to Reddit's success.  It provides that online users and platforms are generally not liable for content created by others.  Section 230 protects Reddit's content moderation decisions, the decisions of its volunteer community moderators and its individual users.  
Finally, Ben advises young in-house lawyers to remember that their job is not just to point out all the potential legal risks in a project, but to help their teams manage those risks so they can build great products and move their companies forward. Podcast link: Law-disrupted.fm. Host: John B. Quinn. Producer: Alexis Hyde. Music and editing by: Alexander Rossi.

The Leading Voices in Food
E269: Children, screen time and wellbeing - many reasons for concern

The Leading Voices in Food

Apr 9, 2025 · 39:38


The amount of time children and adolescents spend with a screen is absolutely stunning. Lots of people, including parents, health leaders, educators, elected leaders from both parties I might mention, and even children themselves, are highly concerned and are discussing what might be done about all this. I'm delighted to begin this series of podcasts on children and screen time. Today we're welcoming two very special guests who can talk about this topic in general, and especially about what's being done to protect children and adolescents. Several podcasts will follow this one that deal with food and nutrition in particular. Our first guest, Kris Perry, is Executive Director of Children and Screens, an organization devoted to protecting children in the digital world by addressing media's impact on child development, communicating state-of-the-art information, and working with policymakers. Prior to joining Children and Screens, Kris was senior advisor to the Governor of California and Deputy Secretary of the California Health and Human Services Agency. Our other guest, Dr. Dimitri Christakis, is a professor of pediatrics at the University of Washington School of Medicine and director of the Center for Child Health Behavior and Development at Seattle Children's. He's also editor-in-chief of JAMA Pediatrics and both Chief Scientific Officer and Chair of the Scientific Advisory Board of Children and Screens. He's also the co-editor of a new book that I'm very excited to discuss. Interview Summary. Download The Handbook of Children and Screens: https://link.springer.com/book/10.1007/978-3-031-69362-5 Kris, let's start with you. Could you set the stage and give us some sense of how much time children and adolescents spend in front of screens, what devices are being used, and what kind of trends you are seeing? Yes, I'd be happy to.
I wish I had better news for your listeners, but as you might imagine, since the advent of the smartphone and social media, youth digital media use has been increasing each year, especially as children get older and have increasing demands on their time to use screens. But let's just start at the beginning of the lifespan and talk about kids under the age of two, who shockingly are spending as much as two hours a day on screens. Most spend about 50 minutes, but there's a significant chunk spending up to two hours. And that rises to three to five hours in childhood. And eventually, in adolescence, our adolescents are spending approximately eight and a half hours a day online. I also wanted to talk a little bit about middle childhood, children six to 12 years of age. 70% of them already have a social media account, and we all know social media wasn't designed for children. There are restrictions on children under 13 using them, and yet most children six to 12 already have an account. Over half of four-year-olds have a tablet, and two thirds of children have their own device by the age of eight; and 90% of teens. This probably won't be surprising, and yet we should really think about what this means: 90% of teens are using YouTube, 60% are on TikTok and Instagram, and 55% use Snapchat. I'll stop by ending on a really alarming statistic. Oh my, there's more? There's more. I know. I told you. I'll be the bearer of bad news so that we can talk about solutions later. But children are checking their devices as often as 300 times per day. 300 times. 300 times per day, and we're talking about screen time right now. And we know that when you're using time to be on screens, you are not doing something else. And we know that childhood is full of challenges and skill building and mastery that requires repetition and tenacity and grit and effort.
And the more children are on their screens, whether it's social media or other entertainment, the less they're doing one of these other critical child development tasks. That's pretty amazing. And the fact that the older kids are spending more time before a screen than they are in school is pretty alarming. And the really youngest kids, that's especially alarming. So, Dimitri, why should we fret about this? And I realize that fret is kind of a mild word here. Maybe all-out panic would be better. But what are some of the major concerns? Well, I don't think panic is ever the right reaction, but the numbers Kris conveyed, you know, I think do paint a, let's say, concerning story. You know, the simple reality is that there's only so much time in a day. And if you think about it, teenagers in particular should sleep for eight to 10 hours a day at a minimum. They really should be in school six and a half, seven hours a day. And then when you add the numbers Kris conveyed, you realize that something's giving, because there isn't enough time left to spend eight and a half hours a day. The two things at a minimum that are giving are sleep and school. Kids are losing sleep to be on screens. And I'm sorry to say that they're losing school while they're on screens. We just published a paper that used passive sensing to see where and when children are on their screens, and found that the typical child in the United States spends an hour and a half during the school day on their device. And it's not, before any of your guests ask, on Wikipedia or Encyclopedia Britannica. It's on the usual suspects of social media, TikTok, etc. So, you know, we talk about displacement, and I think it's pretty obvious what's being displaced during school hours. It's time focused on learning if it's in the classroom, and time focused on being authentically present in real time and space if it's during recess.
School hours are precious in that way, and I think it is concerning that they're spending that much screen time in school. And I told you the median; of course, half of kids are above it. And at the high end, they're spending 30 to 40% of school time on screens. Now, some schools have enacted policies, but they don't typically enforce them very well. One of the things that drives me nuts, Kelly, is that as an academic, you know we love to argue amongst ourselves and hem and haw. And this issue about whether or not there's such a phenomenon as digital addiction is still being hotly debated. Honestly, the only behavioral addiction that's being seriously considered at this point is gaming disorder. The DSM-5 considered gaming disorder but didn't include it, saying in 2013 that it needed further study. In 2022, the WHO did include gaming disorder as an ICD-11 diagnosis. But that's just further evidence of how slow science is compared to technology. Gaming, while it's still an entity, represents a small fraction of most people's screen time. And of the numbers Kris conveyed, only a small fraction on average was gaming. For some people, it's their screen use of choice, but for many, it's social media, YouTube (although I consider YouTube to be a social medium), etc. And at the high end, when you hear the numbers Kris conveyed, in my mind that's a behavioral addiction any way you define it. Well, and if you think about things that we all agree are addictive, like nicotine and alcohol and heroin, people aren't doing them 300 times a day. So it's really pretty remarkable. And that's exactly right. One of the salient criteria for those addictions is that they interfere with activities of daily living. Well, you can't be on a screen for nine hours a day when you're supposed to be asleep for 10 and at school for six without interfering with activities of daily living. The math isn't there.
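The time-budget argument above is simple arithmetic, and it can be sketched in a few lines (an illustrative check only; the hours are the round figures the guests quote in the conversation, with midpoints chosen here for the ranges):

```python
# Back-of-the-envelope check of the "time budget" argument above.
# The hours are the round figures quoted in the conversation, not new data.
HOURS_IN_DAY = 24

sleep_needed = 9.0    # midpoint of the 8-10 hours of sleep recommended for teens
school = 6.75         # midpoint of the ~6.5-7 school hours mentioned
screen_time = 8.5     # average daily screen time cited for adolescents

remaining = HOURS_IN_DAY - (sleep_needed + school + screen_time)
print(f"Hours left for everything else: {remaining:+.2f}")
# A negative remainder means the reported screen time can only fit by
# displacing sleep, school, or both, which is the displacement point
# made in the conversation.
```

With these midpoints the remainder is already below zero before eating, commuting, or play, which is the sense in which "the math isn't there."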
And things like being physically active and going out and playing. That's right. It doesn't add up. So, you don't need the DSM-5. You don't need a psychiatrist. You need a mathematician to tell you that there's too much time on this thing. Alright, so Kris, talk to us, if you will, about the Children and Screens organization. I have a lot of respect for the organization and its work. Tell us how it got started and what its objectives are. Well, it's so great to be on this show with you and get to see you in your day job, Kelly. Because you've been an advisor, like Dimitri, to the institute almost since its inception, which was in 2013. As you know, our founder, Dr. Pamela Hurst-Della Pietra, really became concerned as a parent about the way digital media was impacting her children and sought out some answers. Well, what does this mean? Why is this happening? What should I do? And found out that, this of course being 2013, a long time ago, there wasn't that much research yet. And what there was, was siloed by discipline. In other words, there might be a study among neuroscientists or developmental psychologists, even ophthalmologists. But there really hadn't been, yet, a concerted effort to bring these different disciplines and the research together to try to answer some of these hard questions about the impact on kids. And lo and behold, here we are, almost 13 years since the advent of the smartphone and social media, and there is an astounding amount of research across disciplines. So, what we do at the institute is we try to translate it as fast as we can and make it actionable for parents, providers, and policymakers. And we do that through our Ask the Experts webinar series, where we bring the experts themselves directly to our audience to talk about these impacts and answer questions.
We also create printables, you might say, like tip sheets and the Research at a Glance Digest, and newsletters and FAQs, and we've upgraded our website to make it very navigable for parents of kids of all ages. I even started my own podcast this year, which has been really fun. Dimitri was my first guest, so it's great to see him here. And we have convenings. We're having our third Digital Media Developing Mind Scientific Congress this summer, where the experts come together in person to discuss issues. And we really try to focus them on advancing research and supporting it, translating it, and positioning the issue as a policy priority. We'll be in Washington, DC, where we know lawmakers are grappling with the impact of digital media on child development, how to make online products safer for kids, and how to protect their data. The Institute is in the middle of all of this, trying to facilitate more discussion, more results, and more support for parents primarily. Kris, a couple of things occur to me. One is that the breadth of work you do is really very impressive, because you're not only offering very hands-on, real-world advice for parents on how to navigate this world, but you have advice and helpful resources for policymakers and for researchers. It's really quite an impressive breadth of work. The other thing that occurred to me is that I don't think you and I would have any podcast career at all if it hadn't been for Dimitri helping us out. So thanks, Dimitri. Yeah. So, let me ask you, Dimitri: I know that both you and Kris are committed to an evidence-based approach to making policy. Yeah. But technology advances way more quickly than scientists can evaluate it, much less come up with policies to deal with it. And by the time research gets funded, completed, published, you're on to eight new levels of technology. So how does one handle this fundamental problem of pace? It's a really good question.
I mean, I can tell you that we should at a minimum learn from the mistakes we've made in the past. And, you know, one of the most critical, frankly, that most people don't really understand, concerns the age at which children get social media accounts in this country. Kris pointed out that actually pre-teens routinely have social media accounts. Social media companies do very little to age gate. They're trying to do more now, but even the age we've accepted as normative is 13. Few people know where that comes from. It doesn't come from talking to pediatricians, psychologists, or parents about what age is the appropriate age. It comes entirely from COPPA (the Children's Online Privacy Protection Act), which basically was the original privacy act that said that before the age of 13, companies could not collect data from children. So, because these companies were interested in collecting data, they set the age at 13 so as to not have any constraints on the data they collected. Well, that's not even common sense-based policy, let alone evidence-based policy. And it's never been revisited since. It's very troubling to me. And as things move forward, I think we have to learn from those mistakes. Medicine has a maxim, which is: do no harm. We use that phrase a lot, and I think it's a good one in this case. I think it's a particularly good one as we see the new technologies emerging around artificial intelligence. And you know, again, like any new technology, it has incredible upside. We made the mistake, and we're still paying for it, of not appreciating the downsides of social network sites, and frankly, the internet in general. And I would hope we put guardrails in place now and, if you will, apply the same standard we apply to other, non-technology-based products. You can't introduce a new pharmaceutical to anybody, let alone to children, until you show it's safe and effective. You can't bring toys to the world that are dangerous.
Why do we have more safety precautions around toys than we do around websites for children? You know, a lot of it involves changing defaults, doesn't it? Because if the default is that government or somebody out there has to prove that something is harmful before it gets taken away, that changes everything compared to beginning at a different point, where these companies have to prove that these things are safe. Correct. Before they're permitted. Then the companies would find workarounds and they would play games with that too, but at least that would help some. Well, it would help some. And at least we'd be philosophically in the right place. By the way, Kris didn't say it, so I'll say it. You know, the mission of Children and Screens, lest we sound like Luddites here, is not to get kids away from technology or take away their smartphones. We all recognize that technology is here to stay. I think all of us appreciate the incredible upside that it brings to children's lives. The mission of Children and Screens is to help children lead healthy lives in a digital world. And part of the reason she and I often talk about the concerns we have is because the pros make the case for themselves. I mean, you know, no one needs to come here and tell you how amazing it is that you can Google something or that you can get somewhere with GPS. I mean, we know it's amazing and we all rely on it. And none of us are ever talking about getting rid of that stuff. That makes good sense. It's like, you know, children benefit from the fact that they can get around with their parents in the automobile. But you want to have car seats in there to protect them. Exactly. And that's exactly right. There need to be assurances of safety, and there are none. I mean, there are really virtually none. The age gating is a joke. And even if we accept it as effective, the age set of 13 is too young, in my opinion. We started this conversation talking about these media being addictive. I believe they're addictive.
There are legitimate academics that will debate me on that, and I'm happy to join that debate. But as I said before, it's a tough argument to win when people are spending upwards of 10 to 16 hours a day doing it. I don't know what you call that besides addictive. We can argue about what percentage are doing that, but nevertheless, once you accept something as addictive, for other addictive things we immediately age gate it above 18 or 21, right? Mm-hmm. We don't believe that teenagers have the ability to regulate their alcohol or tobacco or gambling, all of which we accept are addictive. In fact, in the case of alcohol, we raised the age from 18 to 21 because we thought even 18-year-olds weren't able to do it. And yet somehow we think of this behavior as just so different that it doesn't require greater cognitive capacity. And I don't believe that. Yeah, very good point. Kris, let me ask you a question about how you and your colleagues at Children and Screens set priorities, because there are a lot of things that one could potentially worry about as outcomes. There's violence that kids see on social media. There's cognitive and brain development, social development, social interactions, and bullying. Mental health, body image, diet, all these things are out there. How do you decide what to work on? Well, we try to work on all of it. And in fact, we've built up a fair amount of expertise and resources around almost 25 different topics. And we also understand that, you know, childhood is a long period of time. Birth to 18, birth to 21, birth to 25, depending on who you talk to. So, we're able to take those 25 topics and also provide deeper, you might say, resources that address the different stages of development. We're really trying to do as much as we can. What's been interesting over these last few years is trying to figure out when to be reactive and when to be proactive.
And by being proactive, we go out looking for the research, translating it, digesting it, and creating materials with it that we think are really accessible and actionable. At the same time, as Dimitri points out, there are policy windows and there are opportunities that present themselves that you have to react to. If you only talk about what you want to talk about, to each other, you're missing some of these external opportunities to inform policy and policymakers. To help influence the way that parents and providers are talking about the issue. Framing it in such a way that engages youth and makes them want what we want for them. We're really excited by increasing opportunities to partner in coalitions with others that care about kids, and with teachers and nurses and doctors. But we also are speaking directly to leaders in states and school districts, at the federal level, at the local level. You would be, I'm sure, not surprised to hear that we are contacted every day by groups that support parents and families, asking for resources, asking for support, because they're seeing the impact now, over many years, on their children: their development, their academic ability, their cognitive and analytical ability, their social emotional ability, their ability to pay attention to tasks that we all know are critical in building that foundation for, essentially, you know, future success. The Institute is being pulled in many directions. We try really hard to be strategic about what are people asking us for, what does the research say, and how can we get that to them as quickly as possible. Dimitri - Can I add to that? You know, I want to emphasize that the concern around the effects of screen use on children's lives is shared by parents on both sides of the aisle. 75% of parents are concerned about the impact of screens on their children's lives. 35% of teenagers are concerned about their dependence on screens and feel that it has a negative effect on their lives.
Actually, by some studies, some surveys, even more, 35 to 50% of teenagers, are concerned. And both sides of the political aisle agree in large part on this. And Kris and Kelly, you guys are the policy wonks; you can speak more to that. So it's a serious indictment on us as grownups and as a society that we have not done more to deliver on this issue. Why? When there's bipartisan agreement amongst many policymakers, this is not a political issue to speak of, and there is widespread concern on the part of parents and even teenagers, why is nothing happening? Well, one has to look no further than where the money is. And that's a problem. I mean, that's a serious indictment on our political system when we can't deliver something that is needed and basically wanted by everybody but the industry itself. We'll come back and talk in a few moments about the policy issues and where industry gets involved here. But let me take just a bit of a detour from that and talk about the book that I mentioned earlier, because I think it's such a valuable resource. Now, when I mention the name of this book, I'm urging our listeners to write it down or to remember it, because you can get the book at no cost. And I'll come back, Kris, and explain what made that possible and why the decision was made to make this an open access book. But Dimitri, let's begin with you. You, along with Lauren Hale, edited this book, entitled The Handbook of Children and Screens: Digital Media, Development and Wellbeing From Birth Through Adolescence. I think it's an extraordinary piece of work, but tell us about the book. It was an extraordinary undertaking. There are, I think, 178 or 180 authors. Literally, it's a who's who of experts in children and media research in all disciplines. It represents pediatrics, psychiatry, psychology, communications experts, demography, lawyers, neuroscientists. I don't know who I'm forgetting. Every single discipline is represented.
Leading scientists in all of those areas. Virtually every topic that might be of interest to people. And we deliberately made the chapters short and easily accessible. So, it is, I think, a great resource for the constituents we serve: for teachers, for parents, for researchers, for policymakers. And it is free. The hardest part of it, to be honest, as an editor, was getting peer reviewers, because unfortunately every expert was conflicted, since they all had an article in it. But it was a long time coming. And again, this was really the brainchild of Pam (Pamela Hurst-Della Pietra), and we're grateful to have brought it along. So, you go all the way from the neuroscience, how children's brains are reacting to this, all the way out into the public policy and legal arena about what can be done about it, and then kind of everything in between. It's remarkable how much the book covers. It's almost a thousand pages. I mean, it is a tome to be sure. And don't forget to mention, Dimitri, we aren't even two months post publication, and we have 1.6 million views of the document, despite its gargantuan size. I think that is really a tribute to experts like you and others that have really studied this issue and can speak directly to its impacts. It's been great to see the success so far. You know, not a small number of those views is from me logging on. And then a million from me, and then we got there. So, it is free because it's online and you can download it. You can also order a hard copy for, I think, $60, but I'm not sure why you would do that if you can download it for free. But it's up to you. So, Kris, it's unusual for a book like this to be made open access and free to the general public. What made that possible, and why was that so important? We want the maximum number of people to use it and treat it like the premier resource that it is.
And the only way you can really do that is to fund it to be open access and find a publisher that does open access publishing, which we did with Springer. I mean, most journal articles are behind a paywall, and publishers do require you to purchase either a subscription or the document itself to download it or order it. And we just really wanted maximum access. So, we funded it to be published in that way. And I think, honestly, it helped us even sort of create it in the first place. People want to be a part of something that has that level of access and is available so widely. So, I think it was kind of mutually beneficial. It gets more people to read it, but it got more people to write for it too, I think. Right, Dimitri? Dimitri - I agree. I mean, you know, the numbers, 1.6 million, are extraordinary. I mean, Kelly, you've been a journal editor. As the editor of JAMA Pediatrics, if an article gets 70,000 views, it's in our top 1%; you know, 200,000 views is 0.01%. 1.6 million and growing is really extraordinary. And that's about the number of people that read my articles. 1.6. And of course, they're not all scientists. I mean, many of them are parents and maybe policy makers, but that's Kris's point, you know. The moment anyone hits a paywall, even if it's a dollar or two, they're going to walk away. It's great to see it get so much traction. Alright, so again, for our listeners, the title of the book is The Handbook of Children and Screens. It's really a terrific resource. Alright, so let's turn our attention to a really important matter. And we've sort of touched on this, but who's in charge of protecting our children? You know, Dimitri, at the end of the day, help survey this landscape for us. I mean, is it Congress? Is it the administrative branch of government? What role do the courts play? Are there legal actors taking meaningful action? Does what's being done come anywhere near meeting the need? Tell us what that landscape is like.
Well, there aren't adequate protections for children. And we talked a little bit about that earlier. There's been an enormous loophole, unfortunately, created by Congress when it added Section 230 to the Communications Decency Act in 1996. And that was put in place essentially to provide protections for internet companies. It basically said that they should be treated like bookstores and not publishers. That they weren't responsible for content; they were just conveying it. And what that means, in effect, was that the companies had sort of carte blanche to do whatever they want. And they've used that very effectively, legally, to argue that the Act protects them from any restriction, any culpability on their part. That they're exonerated for any ill that occurs as a result of their product. The only exception that's been made to it, to date, was around sex trafficking on Backpage, if anyone remembers that. But other than that, social media sites and internet sites in general have been able to say that they're not liable for anything that's done. And I think that was a huge mistake that was made. It needs to be rectified. It's being challenged in the courts presently. My own belief, and I'm not speaking as a lawyer, is that when that law was passed, it was under the assumption, as I said, that they were just conveying information. No one at the time foresaw the development of algorithms that would feed the information. It's really not a bookstore when you are making recommendations. Once you start recommending things, I think you're no longer merely a purveyor of product. You're actually pushing it. So, Kris, tell us about Children and Screens and the role the organization plays in this space. How do you deal with policy, and is it possible to be bipartisan? Yeah, I mean, it's essential.
There's no way to get anything done anywhere on these policy matters at a population level without working in a bipartisan or nonpartisan manner, which is what we've always done. And it's easy to do that when you're following the science, not ideology. And you're putting the science first and creating resources and tools and support for those, mostly staffers, honestly, that are trying to help their bosses get smarter and better at talking about these issues as they evolve and become more complicated over time. It takes more effort to staff a lawmaker on this front. And they're very anxious to learn and understand, because they're meeting with parents of children who have been harmed, or frankly didn't even survive their childhood, because of a social media platform. There's great urgency on the part of policymakers. We've heard everything from school phone bans to outright social media bans proposed as policies. And one thing I like to come back to is that it's one thing to want to take action and make your best guess at what would have the best impact. But it's another thing to study whether or not that policy actually achieved its result. And part of staying bipartisan, nonpartisan, is that it allows us to say, 'Hey, lawmaker, if you're able to get that to happen, we'd really like to come in and help study whether or not your idea actually achieves the results that you wanted, or if it needs to be adjusted or amended over time.' Fantastic. That's so important to be doing that work, and I'm delighted the organization is doing it. Let me ask a question here. Think about some of the areas of public health that I've been following: tobacco, for example; opioids more recently; vaping products; and in the case of my own particular work, food policy. The administrative and legislative branches of government have been almost completely ineffective. If I think about food policy over the years, relatively little has been accomplished.
Even though lots of people have worked really hard on it. Same thing happened with tobacco for many years. Opioids, same thing. It's not until you get the third branch of government involved, the judiciary, and you start suing the actors who are causing the harm that you get much action. Not only do the lawsuits seem to have an effect, but they soften the ground for legislative things that then can occur because public opinion has changed. And then those things help make a difference as well. What do you think about that kind of issue in this space? I think you're exactly right. I mean, I think the failure of our legislative branch to enact policy leaves us with very few options at this point anyway, except to try to pursue it through the judiciary. There are challenges there. First and foremost, it's a big and well-funded industry, not unlike tobacco or big food, as you mentioned, and there's this Section 230 that's given them kind of blanket immunity to date. But there are many, many very large pending cases in several jurisdictions brought by individuals, brought by school districts, brought by states. And those, at least provisionally, have gotten further than prior cases, which have been thrown out based on Section 230. So, we'll see what happens with that litigation. But right now, my guess is it's the best chance we have to set some guardrails. And I think there are plenty of guardrails that could be set. Everything that these companies have done to make their products addictive can be undone. Can be made protective. The tobacco companies deliberately designed their products to be addictive, even while they tried to claim that they were less addictive. They made light cigarettes that had holes in the filter so that it would dilute the carbon and nicotine, but people quickly learned they could cover those up with their fingers, think they were smoking light cigarettes, and smoke more of them. 
There's a lot that can be done in this space to undesign the problematic nature of the products. And quite apart from the financial settlements, which will get companies' attention, I hope that that's part of any settlement if it gets that far. It'll be interesting to see where those go. And also, historically, one important part of these lawsuits is what gets turned up in discovery. What sort of intent did the companies have, and how much did they know about harms? How much did they know about addiction and things like that? And how might they have proceeded in the face of that information that then doesn't get disclosed to the public? In any event, we'll see where that goes. Dimitri, what about the argument that responsibility resides with parents? That it's up to parents to protect their kids from this, and government doesn't need to be involved? I've never understood that argument. I mean, parents obviously are children's most important safeguard, but as a society, we enact policies and laws to assist parents in that. I mean, to me, if I made the argument, well, why do we have minimum drinking ages? It's parents' job to make sure their kids don't drink. How would that possibly play out? Look, it's hard enough as a parent anyway, because kids do get around these laws. But we still have them, and it's a lot easier as a parent. I think most parents would agree their life's made easier by minimum age restrictions on certain things. We have seatbelt laws. I mean, why do we have seatbelt laws? Why don't we just say it's parents' job to make sure their kids buckle up? The truth is it's society and parents working hand in hand to try and keep children safe. And I think it also helps parents to be able to say that there are laws around this, and I expect you to follow the laws. So, I don't think it's an either/or. Okay, well, I think that's a very good way to frame it. There are many, many precedents where we protect children. And why not do it here too? 
So let me end with a question I'd like to ask both of you. In this sea of concerns that we've discussed, is there a reason for optimism? And Kris, let me start with you. What do you think? Absolutely. I think the young people I've met that are leading among their peers are incredibly impressive, and they're armed with the research and their energy and their own lived experience in ways that are very compelling. At the same time, I think about the vast amount of research that has now been compiled and translated and acted upon, whether in courtrooms or in state houses; there's more of it, and we're all getting more steeped in and aware of more nuanced information. And finally, I would just say, there is a tipping point. As a society, adults and kids alike, we are reaching a tipping point where we can't withstand the pressure of technology in every aspect, every corner of our day, our life. And we want relief. We deserve relief. And I think that's what's going to take us over the finish line. Good. Well, I'm glad to hear those optimistic notes. Dimitri, what about you? I can find reasons to be optimistic. I mean, look, the reality is that technologies have enriched our lives in many ways. And I think if we put guardrails in place, we can make sure that future ones do even better. I have a piece coming out in JAMA Pediatrics around the use of AI, which people are very concerned about, I think rightly. But it's specifically about the use of AI for people with intellectual and developmental disabilities, making the case that there are ways in which it could be extremely beneficial to that population. A population I care deeply about in my role as the Chief Health Officer at Special Olympics International. And in particular, let's say in terms of the doctor-patient interaction, where it could facilitate their communication with their provider, and it could also help the provider better communicate with them. 
Look, that use case isn't going to be a priority for the purveyors of artificial intelligence. It's a small, non-lucrative use of a technology. But it's a good one. And if we created the right incentives and put in the right guardrails, we could find many other ways that technology can serve the needs of all of us going forward. I think the problem is that we've tended to be reactive rather than proactive. And to not start with the do-no-harm-first premise, particularly when it comes to children. AI is another example of that, where I hope we don't make the same mistake we made with social media.

Bios

Kris Perry is the executive director of the Children and Screens Institute. Kris most recently served as Senior Advisor to Governor Gavin Newsom of California and Deputy Secretary of the California Health and Human Services Agency, where she led the development of the California Master Plan for Early Learning and Care and the expansion of access to high-quality early childhood programs. She led systems change efforts at the local, state and national levels in her roles as executive director of First 5 San Mateo, First 5 California and the First Five Years Fund. Through it all, Perry has fought to protect children, improve and expand early learning programs, and increase investments in low-income children. Perry was instrumental in returning marriage equality to California after the landmark 2013 U.S. Supreme Court ruling Hollingsworth v. Perry, which she wrote about in her book Love on Trial (Roaring Forties Press, 2017).

Dimitri Christakis, MD, MPH is the Children and Screens Institute's inaugural Chief Science Officer. He is also the George Adkins Professor at the University of Washington, Editor in Chief of JAMA Pediatrics, and the Chief Health Officer at Special Olympics International. Christakis is a leading expert on how media affects child health and development. 
He has published over 270 peer-reviewed articles (h-index 101), including dozens of media-related studies, and co-authored a groundbreaking book, The Elephant in the Living Room: Make Television Work for Your Kids. His work has been featured on Anderson Cooper 360, the Today Show, ABC, NBC, and CBS News, as well as all major national newspapers. Christakis received his undergraduate degree at Yale University and his medical training at the University of Pennsylvania School of Medicine, and completed his residency and a Robert Wood Johnson Clinical Scholar Fellowship at the University of Washington School of Medicine. 

Sexploitation
The Dirty Dozen List Presents: Section 230 of the Communications Decency Act


Play Episode Listen Later Apr 3, 2025 46:08


Haley McNamara (NCOSE Senior VP of Programs and Initiatives) and Dani Pinter (Senior VP and Director at the NCOSE Law Center) talk about Section 230 of the Communications Decency Act (CDA) and why it's essential for it to end. They also discuss the history of the Dirty Dozen List and what led to this unique version of the list in 2025. Since its inception in 1996, Section 230 has effectively provided blanket immunity to big tech companies for harms facilitated on their platforms. It's time to call for a full repeal of CDA Section 230! Learn more and take action here: www.DirtyDozenList.org Watch the video version of this episode here: https://youtu.be/G7VZVJ1QRUc

The Just Security Podcast
The Just Security Podcast: Regulating Social Media — Is it Lawful, Feasible, and Desirable? (NYU Law Forum)


Play Episode Listen Later Mar 26, 2025 72:24 Transcription Available


2025 will be a pivotal year for technology regulation in the United States and around the world. The European Union has begun regulating social media platforms with its Digital Services Act. In the United States, regulatory proposals at the federal level will likely include renewed efforts to repeal or reform Section 230 of the Communications Decency Act. Meanwhile, states such as Florida and Texas have tried to restrict content moderation by major platforms, but have been met with challenges to the laws' constitutionality. On March 19, NYU Law hosted a Forum on whether it is lawful, feasible, and desirable for government actors to regulate social media platforms to reduce harmful effects on U.S. democracy and society, with expert guests Daphne Keller, Director of the Program on Platform Regulation at Stanford Law School's Cyber Policy Center, and Michael Posner, Director of the Center for Business and Human Rights at NYU Stern School of Business. Tess Bridgeman and Ryan Goodman, co-editors-in-chief of Just Security, moderated the event, which was co-hosted by Just Security, the NYU Stern Center for Business and Human Rights, and Tech Policy Press.

Show Notes:
Tess Bridgeman
Ryan Goodman
Daphne Keller
Michael Posner
Just Security's coverage on Social Media Platforms
Just Security's coverage on Section 230
Music: “Broken” by David Bullard from Uppbeat: https://uppbeat.io/t/david-bullard/broken (License code: OSC7K3LCPSGXISVI)

2 Girls 1 Podcast
45 Shocker: Mental Health Information On TikTok Is Wildly Inaccurate (and Might Make You Feel Worse)


Play Episode Listen Later Mar 26, 2025 35:27


A peer reviewed study from the University of British Columbia analyzed the most popular ADHD-related videos on TikTok and found that very few of them contain information that aligns with medical diagnostics. Even more interesting, people who consume this kind of content perceive the diagnosis to be far more widespread - and symptoms far more intense - than in reality. Social media is warping our sense of reality and self?! Inconceivable! Plus: New bi-partisan legislation aims to repeal Section 230 of the Communications Decency Act. The 1996 law shields websites from litigation if their users upload harmful content. This protection is integral to how the modern Web functions, and erasing it could lead to over-moderation and fear-based censorship. While harmful user content is always a concern, the true problem with social media is not that the content exists, but its weaponization by tech companies in the name of profit. In this case, Congress is fixing the wrong problem, and the consequences could damage online speech for years to come. A double-edged hashtag: Evaluation of #ADHD-related TikTok content and its associations with perceptions of ADHD: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0319335&_bhlid=eba84bf6525fc0863f3276ea4a45eb0891dac351 TikTok is full of ADHD advice — just don't trust it for a diagnosis: https://www.npr.org/2025/03/24/nx-s1-5336303/adhd-symptoms-adult-tiktok Lawmakers are trying to repeal Section 230 again: https://www.theverge.com/news/634189/section-230-repeal-graham-durbin This show is made possible by listener support: https://www.patreon.com/influencepod Listen & subscribe wherever you get podcasts:

The Tech Jawn
To 230 Or Not To 230


Play Episode Listen Later Mar 25, 2025 59:52


In this episode of The Tech Jawn, we discuss 23AndMe filing for bankruptcy, a facial recognition company that tried to buy Social Security numbers for its database, and several senators who want to repeal Section 230 of the Communications Decency Act.

Hosts:
Robb Dunewood – @RobbDunewood
Stephanie Humphrey – @TechLifeSteph
Terrance Gaines – @BrothaTech

Stories Mentioned:
-- 23AndMe Files for Chapter 11 Bankruptcy -- The Verge
-- Facial Recognition Company Clearview Attempted to Buy Social Security Numbers and Mugshots for its Database -- 404 Media
-- Lawmakers want to repeal Section 230. Is this a good or bad thing? -- The Verge

Support The Tech Jawn by becoming a Patron – https://thetechjawn.com/patreon

Hosted on Acast. See acast.com/privacy for more information.

The Ben Joravsky Show
Jim Coogan—TikTok Turmoil


Play Episode Listen Later Jan 18, 2025 62:20


Fox has an interesting interpretation of objective news presentation. Ben riffs. Jim Coogan talks TikTok. Which brings him to the all-important Section 230 of the Communications Decency Act. The world would look a lot different if 230 did not exist, that's for sure. Also, Trump's unconditional discharge. Ok, stop snickering, people. It's not what you think. And the ethics of the Alito-Trump phone call. Jim is an attorney with Coogan Gallagher. His views are his own.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Project 2025: The Ominous Specter
Radical Transformation or Dangerous Power Grab? The Debate Over Project 2025's Vision for America


Play Episode Listen Later Jan 12, 2025 6:39


As I delved into the intricacies of Project 2025, a comprehensive policy initiative crafted by the Heritage Foundation, I was struck by the sheer scope and ambition of its proposals. This 900-page blueprint, released in April 2023, outlines a radical transformation of American governance, touching on virtually every aspect of federal policy, from education and healthcare to technology and environmental regulation.

At its core, Project 2025 is a manifesto for a conservative revolution, envisioning a federal government reshaped in the image of a strong, centralized executive branch. The project's architects argue that the current system of independent federal agencies undermines the democratic republic, and they propose placing the entire executive branch under direct presidential control. As Heritage Foundation President Kevin Roberts puts it, "The notion of independent federal agencies or federal employees who don't answer to the president violates the very foundation of our democratic republic."[1]

One of the most striking aspects of Project 2025 is its plan to dismantle and reconfigure several key federal agencies. The Department of Education, for instance, would be abolished, with its programs either transferred or terminated. Education would be left to the states, with federal funding for low-income students, such as Title I of the Elementary and Secondary Education Act of 1965, allowed to expire. Instead, public funds would be channeled into school vouchers, even for private or religious schools, reflecting the project's belief that education is a private rather than a public good[1].

The Department of Homeland Security would also face significant changes, with Project 2025 advocating for its dismantling. 
This move is part of a broader strategy to reshape national security and immigration policies, including the arrest, detention, and mass deportation of illegal immigrants and the deployment of the military for domestic law enforcement[1].

In the realm of healthcare, Project 2025 proposes drastic cuts to Medicare and Medicaid, and it urges the government to explicitly reject abortion as healthcare. The plan also seeks to eliminate coverage of emergency contraception and use the Comstock Act to prosecute those who send and receive contraceptives and abortion pills. This stance is part of a broader agenda to roll back reproductive rights and impose conservative moral values on healthcare policy[1].

The project's vision for science and research is equally transformative. It prioritizes fundamental research over applied research, arguing that many current programs act as subsidies to the private sector. Climate change research would be significantly curtailed, with the U.S. Global Change and Research Program facing critical analysis and potential rejection of its assessments prepared under the Biden administration. The Environmental Protection Agency (EPA) would be restricted from using "unrealistic" projections of climate change impacts, and its science activities would require clear congressional authorization[4].

Project 2025 also targets the tech and media landscape, proposing significant reforms to the Federal Communications Commission (FCC) and the Federal Trade Commission (FTC). The plan includes increasing agency accountability, reducing wasteful spending, and promoting national security and economic prosperity. It suggests that Big Tech companies should contribute to the Universal Service Fund, currently funded through telephone bills, to support the expansion of 5G and satellite connectivity. 
Additionally, the project advocates for revising Section 230 of the Communications Decency Act, limiting social media's ability to moderate content and ban individuals from their platforms[3].

The implications of these proposals are far-reaching and have sparked intense debate. Critics argue that Project 2025 represents a blueprint for an autocratic takeover, undermining the system of checks and balances that is foundational to American democracy. As one analysis notes, "Project 2025 would destroy the U.S. system of checks and balances and create an imperial presidency," giving the president almost unlimited power to implement policies without significant oversight[5].

The project's stance on civil rights is particularly contentious. It rejects diversity, equity, inclusion, and accessibility (DEIA) initiatives as "managerialist left-wing race and gender ideology" and proposes banning funding for critical race theory. The Department of Justice and the Equal Employment Opportunity Commission would be used to undermine protections for LGBTQ employees and to prosecute private employers that support DEIA in their workplaces[2].

As I navigated the complex web of policies outlined in Project 2025, it became clear that this initiative is not just a collection of policy proposals but a coherent vision for a fundamentally different America. The project's backers see it as a necessary corrective to what they perceive as the "totalitarian cult" of the "Great Awokening," a term they use to describe the cultural and political shifts of recent years[3].

Despite Donald Trump's attempts to distance himself from the project, many of its authors have close ties to his administration. The connection is evident in the overlap between Project 2025's recommendations and Trump's own policy agenda. 
For example, Trump has called for NPR funding to be rescinded, echoing Project 2025's criticism of public broadcasting as a "liberal disinformation machine"[3].

As the 2025 presidential transition approaches, the potential implementation of Project 2025's policies looms large. The project's authors envision an "army of aligned, vetted, trained, and prepared" personnel ready to execute these reforms swiftly. If realized, these changes would mark a seismic shift in American governance, one that could redefine the balance of power between the executive branch and other institutions of government.

In the coming months, as the political landscape continues to evolve, the fate of Project 2025 will remain a critical point of contention. Whether its proposals are adopted in whole or in part, one thing is certain: the initiative has already sparked a national conversation about the future of American democracy and the role of the federal government in shaping that future. As we move forward, it will be essential to closely monitor these developments and consider the profound implications they hold for the country's governance, civil rights, and societal values.

So to Speak: The Free Speech Podcast
Ep. 233: Rethinking free speech with Peter Ives


Play Episode Listen Later Jan 9, 2025 81:06


Is the free speech conversation too simplistic?  Peter Ives thinks so. He is the author of “Rethinking Free Speech,” a new book that seeks to provide a more nuanced analysis of the free speech debate within various domains, from government to campus to social media. Ives is a professor of political science at the University of Winnipeg. He researches and writes on the politics of “global English," bridging the disciplines of language policy, political theory, and the influential ideas of Antonio Gramsci. Enjoying our podcast? Donate to FIRE today and get exclusive content like member webinars, special episodes, and more. If you became a FIRE Member through a donation to FIRE at thefire.org and would like access to Substack's paid subscriber podcast feed, please email sotospeak@thefire.org. Read the transcript. Timestamps:  00:00 Intro 02:25 The Harper's Letter 05:18 Neil Young vs. Joe Rogan 08:15 Free speech culture 09:53 John Stuart Mill 12:53 Alexander Meiklejohn 17:05 Ives's critique of Jacob Mchangama's “History of Free Speech” book 17:53 Ives's definition of free speech 19:38 First Amendment vs. Canadian Charter of Rights 21:25 Hate speech 25:22 Canadian Charter and Canadian universities 34:19 White supremacy and hate speech 40:14 Speech-action distinction 46:04 Free speech absolutism 48:49 Marketplace of ideas 01:05:40 Solutions for better public discourse 01:13:02 Outro  Show notes: The Canadian Charter of Rights and Freedoms (1982) “A Letter on Justice and Open Debate” Harper's Magazine (2020) “On Liberty” John Stuart Mill (1859) “Free Speech: A History from Socrates to Social Media” Jacob Mchangama (2022) Brandenburg v. Ohio (1969) Mahanoy Area School District v. B.L. (2021) Canadian Criminal Code (1985) Bill C-63 - An Act to enact the Online Harms Act (2024) McKinney v. University of Guelph (1990) “When is speech violence?” The New York Times (2017) Section 230 (Communications Decency Act of 1996)

Keen On Democracy
Episode 2290: Marshall Poe on why 2024 was a bad year for most podcasters


Play Episode Listen Later Jan 1, 2025 40:44


Marshall Poe runs the New Books Network, a podcasting platform incorporating over 25,000 individual podcasts from thousands of podcasters, with many millions of downloads. 2024, he acknowledges, was a bad year for podcasting because Apple changed their metrics so that the audience numbers for most podcasts fell precipitously overnight. And 2025, he suggests, probably isn't going to be much better, with winner-take-all podcasters like Joe Rogan hogging most of the audience and profits. How could the internet be made more democratic again so that podcasters on platforms like the New Books Network and entrepreneurs like Marshall Poe can make a living from their work? Poe isn't particularly hopeful, but suggests that a reform of Section 230 of the Communications Decency Act of 1996 might represent a beginning to restoring the leveling promise of the digital revolution.

Marshall Tillbrook Poe is an American historian, writer, editor, and founder of the New Books Network, an online collection of podcast interviews with a wide range of nonfiction authors. He has taught Russian, European, Eurasian, and world history at various universities including Harvard, Columbia, University of Iowa, and, currently, the University of Massachusetts Amherst. Poe is the author or editor of a number of books for children and adults.

Named as one of the "100 most connected men" by GQ magazine, Andrew Keen is amongst the world's best known broadcasters and commentators. In addition to presenting KEEN ON, he is the host of the long-running How To Fix Democracy show. He is also the author of four prescient books about digital technology: CULT OF THE AMATEUR, DIGITAL VERTIGO, THE INTERNET IS NOT THE ANSWER and HOW TO FIX THE FUTURE. Andrew lives in San Francisco, is married to Cassandra Knight, Google's VP of Litigation & Discovery, and has two grown children.

Keen On is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber. 
This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit keenon.substack.com/subscribe

Podcast on Crimes Against Women
Reforming Laws to Protect Against Digital Violence & Image-Based Sexual Abuse


Play Episode Listen Later Dec 9, 2024 40:55 Transcription Available


Brace yourself for a candid conversation about the urgent and often hidden issues of violence and digital abuse against women. Dani Pinter, Senior Legal Counsel for the National Center on Sexual Exploitation, joins us to pull back the curtain on the alarming reality of non-consensual sexual exploitation online. We confront shocking statistics and explore the staggering impact of these crimes on victims, as Dani shares insights into the history and mission of her organization to combat these deeply entrenched societal problems.

Our discussion takes a hard look at image-based sexual abuse and the complex landscape of legal accountability surrounding platforms like Backpage and Pornhub. We draw parallels between past legal actions against industries like big tobacco and current efforts to hold websites responsible for enabling exploitation. The conversation highlights the alarming rise of voyeuristic content and fake sexual images created by AI, as we unravel the challenges of curbing these offenses in the digital age and the significant implications for privacy and consent.

Navigating the murky waters of online accountability, we tackle the barriers victims face as they seek justice and content removal. The conversation zeroes in on Section 230 of the Communications Decency Act, emphasizing the urgent need for legal reform to empower victims and hold tech companies accountable. We also spotlight the progress made in addressing image-based sexual abuse, including amendments to the Violence Against Women Act, marking key steps toward more robust legal protections. Join us as we chart a path forward in advocating for victims and challenging societal norms and institutional practices.

Masters of Privacy
Lukasz Olejnik: Propaganda, misinformation, the DSA, Section 230, and the US elections


Play Episode Listen Later Nov 3, 2024 28:30


Dr Lukasz Olejnik (@lukOlejnik), LL.M, is an independent cybersecurity, privacy and data protection researcher and consultant. Senior Visiting Research Fellow of the Department of War Studies, King's College London. He holds a Computer Science PhD from INRIA (French Institute for Research in Digital Science and Technology), and an LL.M. from the University of Edinburgh. He worked at CERN (European Organisation for Nuclear Research), and was a research associate at University College London. He was associated with Princeton's Center for Information Technology Policy, and Oxford's Centre for Technology and Global Affairs. He was a member of the W3C Technical Architecture Group. Former cyberwarfare advisor at the International Committee of the Red Cross in Geneva, where he worked on the humanitarian consequences of cyber operations. Author of scientific articles, op-eds, analyses, and the books Philosophy of Cybersecurity and “Propaganda”. He contributes public commentary to international media.

References:
Full interview transcript (on Medium)
Propaganda, by Lukasz Olejnik
Lukasz Olejnik on Cyber, Privacy and Tech Policy Critique (Newsletter)
Lukasz Olejnik on Mastodon
Lukasz Olejnik on X
EU Digital Services Act (DSA)
Section 230 (“Protection for private blocking and screening of offensive material“) of the Communications Decency Act (1996)
Cubby, Inc. v. CompuServe Inc. and Stratton Oakmont, Inc. v. Prodigy Services Co. as precursors to Section 230
Doppelganger in action: Sanctions for Russian disinformation linked to Kate rumours
EU takes shot at Musk over Trump interview — and misses (Politico)
The story of Pavel Rubtsov (“Journalist or Russian spy? 
The strange case of Pablo González”), The Guardian
Silicon Valley, The New Lobbying Monster (mentioning Chris Lehane's campaigns), The New Yorker
Financial Times: Clip purporting to show a Haitian voting in Georgia is among ‘Moscow's broader efforts' to sway the race
“Pseudo-media”: Spain proposes tightening rules on media to tackle fake news

So to Speak: The Free Speech Podcast
Ep. 228: Does artificial intelligence have free speech rights?


Play Episode Listen Later Nov 1, 2024 70:43


In this live recording of “So to Speak” at the First Amendment Lawyers Association meeting, Samir Jain, Andy Phillips, and Benjamin Wittes discuss the legal questions surrounding free speech and artificial intelligence. Samir Jain is the vice president of policy at the Center for Democracy and Technology. Andy Phillips is the managing partner and co-founder at the law firm Meier Watkins Philips and Pusch. Benjamin Wittes is a senior fellow in governance studies at the Brookings Institution and co-founder and editor-in-chief of Lawfare. Read the transcript. Timestamps:  00:00 Intro 01:54 The nature of AI models 07:43 Liability for AI-generated content 15:44 Copyright and AI training datasets 18:45 Deepfakes and misinformation 26:05 Mandatory disclosure and AI watermarking 29:43 AI as a revolutionary technology 36:55 Early regulation of AI  38:39 Audience Q&A 01:09:29 Outro Show notes: -Court cases: Moody v. NetChoice (2023) The New York Times Company v. Microsoft Corporation, et al (2023) Millette v. OpenAI, Inc (2024) Walters v. OpenAI, L.L.C. (2024) -Legislation: Section 230 (Communications Decency Act of 1996) AB 2839 - Elections: deceptive media in advertisements AB 2655 - Defending democracy from deepfake deception Act of 2024 California AI transparency Act  Colorado AI Act NO FAKES Act of 2024  -Articles: “A machine with First Amendment rights,” Benjamin Wittes, Lawfare (2023) “22 top AI statistics and trends in 2024,” Forbes (2024) “Global risks 2024: Disinformation tops global risks 2024 as environmental threats intensify,” World Economic Forum (2024) “Court lets first AI libel case go forward,” Reason (2024) “CYBERPORN - EXCLUSIVE: A new study shows how pervasive and wild it really is. Can we protect our kids – and free speech?” TIME (1995) “It was smart for an AI,” Lawfare (2023)

Scrolling 2 Death
Snapchat Filters That Kill (with attorney Mike Neff)


Play Episode Listen Later Oct 28, 2024 34:00


In this conversation, Nicki Reisberg interviews Michael Lawson Neff, a founding partner at Neff Injury Law, about the legal implications of online harms, particularly focusing on the case against Snapchat related to its deadly speed filter. They discuss the challenges of holding tech companies accountable under Section 230 of the Communications Decency Act, the impact of social media on youth, and the importance of parental guidance in navigating these issues. Neff shares insights from his ongoing litigation and emphasizes the need for legislative changes to protect children from online dangers. This episode is sponsored by Bark Technologies. Learn about the Bark App for iPhones and Androids: *Use code SCROLLING2DEATH FOR 10% OFF Check out the Bark Phone Learn about the Bark Watch Meet Michael Neff: Michael Neff is the founder of Neff Injury Law, a national litigation firm based in Atlanta, Georgia, specializing in serious injury and wrongful death cases. With over 25 years of experience, Mike has established himself as a dedicated advocate for individuals facing off against large corporations and insurance companies. His firm's track record includes securing over $100 million in jury verdicts and settlements, with notable verdicts in 2013 and 2017 that ranked among the top 100 civil verdicts nationally for those years. --- Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

Minimum Competence
Legal News for Weds 10/16 - Meta Faces Claims by 34 States, SCOTUS Rejects Uber's Challenge to CA Labor Law, Swift Election Litigation and Stolen Tax Refund Checks

Minimum Competence

Play Episode Listen Later Oct 16, 2024 6:45


This Day in Legal History: Nazi War Criminals Hanged

On October 16, 1946, ten high-ranking Nazi war criminals were executed by hanging after being convicted by the International Military Tribunal at Nuremberg. This landmark trial held key figures of Adolf Hitler's regime accountable for crimes against humanity, war crimes, and genocide committed during World War II. Among those executed was Joachim von Ribbentrop, the former German Foreign Minister, who had played a significant role in Nazi diplomacy, including the negotiation of the non-aggression pact with the Soviet Union. Others included Wilhelm Keitel, head of the German Armed Forces, and Alfred Jodl, a top military strategist.

The Nuremberg trials were a historic moment in international law, establishing the precedent that individuals—even heads of state and military leaders—could be held criminally responsible for war crimes. The tribunal addressed the atrocities of the Holocaust, the invasion of neighboring countries, and the brutal treatment of civilians and prisoners of war. The executions followed months of legal proceedings and were seen as a step toward justice for millions of victims. Two of the condemned, Hermann Göring and Martin Bormann, avoided the gallows—Göring by committing suicide the night before the executions, and Bormann being sentenced in absentia, as he was never captured. These trials helped shape modern principles of international law, including the concepts of crimes against humanity and the rule of law in war.

Meta Platforms Inc. must face claims by 34 state attorneys general accusing the company of contributing to a youth mental health crisis by getting children hooked on Facebook and Instagram. A federal judge in California ruled that some claims in the lawsuit could proceed, while others were dismissed under Section 230 of the Communications Decency Act, which shields internet companies from liability over user-generated content.
The states allege Meta's platforms cause mental health issues, like depression, in young users and that the company unlawfully collected data from children under 13. The lawsuit is part of a broader legal push against social media companies like TikTok, YouTube, and Snap, all of which are accused of profiting from the addiction of young users. Meta's spokesperson defended the company's actions, pointing to tools for parental controls and recent changes to Instagram's teen accounts. However, the judge noted that Meta's alleged “public campaign of deception” about the dangers of social media addiction could violate state and federal laws. The ruling also allows claims challenging features like “appearance-altering filters” but limits challenges to infinite scroll and likes. The decision comes alongside similar lawsuits by public school districts alleging social media companies create a public nuisance.

Meta Can't Escape States' Claims It Hooked Kids on Platforms (4)

The US Supreme Court declined to revive a challenge by Uber and Postmates to California's employment classification law, AB 5, leaving in place a Ninth Circuit ruling. AB 5 requires most workers to be classified as employees, giving them broader protections and benefits compared to independent contractors. Although Uber and other app-based companies are exempt from AB 5 under Proposition 22, which voters approved in 2020, they faced penalties for alleged violations before Prop 22 took effect.

Uber and Postmates argued that AB 5 unfairly targeted their industries, claiming the law violated their equal protection rights by exempting other sectors. However, the Ninth Circuit ruled that lawmakers had rational reasons for distinguishing between industries, suggesting that ride-hailing companies were perceived as larger contributors to worker misclassification.
The companies petitioned the Supreme Court, but the justices allowed the lower court's decision to stand, effectively ending their constitutional challenge to the law.

Supreme Court Stymies Uber's Challenge to California Labor Law

Courts in key battleground states are implementing procedures to expedite election-related lawsuits ahead of the November 2024 election to avoid delays in finalizing results. Arizona's Supreme Court recently ordered trial courts to prioritize election disputes, ensuring any challenges, such as those concerning recounts or presidential electors, are resolved quickly. This comes as both Republicans and Democrats have filed numerous lawsuits ahead of the election, and experts predict more legal battles on Election Day over vote counting and certification.

Similar measures have been adopted in other battleground states, including Pennsylvania, which shortened the timeframe for appeals to three days, and Michigan, which introduced protocols for handling emergency election-related rulings. These actions are seen as a proactive response to the legal chaos of the 2020 election, when former President Donald Trump and his allies unsuccessfully challenged results with claims of widespread voter fraud. Courts are also preparing for potential security risks, with warnings of increased threats to judges during periods of national tension. Legal experts praise these steps as a way to ensure smooth and timely election litigation.

Courts in US battleground states move to swiftly decide election cases | Reuters

My column for Bloomberg this week discusses how the IRS can solve the issue of stolen tax-refund checks, increasingly a major issue, by embracing technology. Despite the availability of direct deposit, many taxpayers still rely on paper checks, which are vulnerable to theft. I argue that the IRS should offer secure digital refund cards, similar to the electronic benefits transfer (EBT) cards used in welfare programs, for taxpayers without bank accounts.
These cards can be mailed securely, with separate deliveries for the card and its PIN, reducing theft risks. For those who prefer physical checks, I suggest allowing taxpayers to pick them up at secure locations like post offices, where the checks could be activated upon identity verification. This method would work like a software kill switch for smartphones, rendering checks useless if stolen before activation. Additionally, an optional mobile app could provide tracking, security, and refund management features for tech-savvy taxpayers.

These solutions would enhance refund security while ensuring flexibility and accessibility. The IRS should also collaborate with local organizations to help taxpayers navigate these new systems, ensuring no one is left behind in the transition to a more secure refund process.

Secure Digital Tax Refund System Can Solve Stolen Check Problem

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit www.minimumcomp.com/subscribe

So to Speak: The Free Speech Podcast
Ep. 226: ‘Shouting fire,' deepfake laws, tenured professors, and mask bans

So to Speak: The Free Speech Podcast

Play Episode Listen Later Oct 10, 2024 65:35


The FIRE team discusses Tim Walz's controversial comments on hate speech and “shouting fire in a crowded theater.” We also examine California's AI deepfake laws, the punishment of tenured professors, and mask bans. Joining us are: Aaron Terr, FIRE's director of Public Advocacy; Connor Murnane, FIRE's Campus Advocacy chief of staff; and Adam Goldstein, FIRE's vice president of strategic initiatives. Read the transcript.

Timestamps:
00:00 Intro
01:51 Tim Walz's comments on hate speech and “shouting fire”
15:36 California's AI deepfake laws
32:05 Tenured professors punished for expression
54:27 Nassau County's mask ban
1:04:39 Outro

Show notes:
Court cases: Schenck v. United States (1919); Brandenburg v. Ohio (1969); National Socialist Party of America v. Village of Skokie (1977); Texas v. Johnson (1989); Snyder v. Phelps (2011); Matal v. Tam (2017); Virginia v. Black (2003); NAACP v. Alabama (1958); Kohls v. Bonta (this suit challenges the constitutionality of AB 2839 and AB 2655) (2024); G.B. et al. v. Nassau County et al. (this class action lawsuit alleges Nassau County's Mask Transparency Act is unconstitutional and discriminates against people with disabilities) (2024)
Legislation: AB 2839; AB 2655; AB 1831; Title VI (Civil Rights Act of 1964); Section 230 (Communications Decency Act of 1996)
Articles/Tweets: “This is amazing

The Majority Report with Sam Seder
2318 - 63 Days Until Harris V. Trump & The Ruling That May Change Social Media Forever

The Majority Report with Sam Seder

Play Episode Listen Later Sep 3, 2024 89:42


It's News Day Tuesday! Sam is back from vacation, and he and Emma break down the biggest headlines from over the long weekend. First, they run through updates on the final 9 weeks before the US presidential election, Russia's bombardment of Ukraine, domestic dissent to Israel's expanding offensive as it turns to the West Bank, Unite Here labor action, Trump's Arlington fiasco, RFK's ballot fiasco, and Venezuela's political crisis, before parsing through JD Vance's DiMaggio-esque run of the most clinically insane misogyny.

Next, they dive into updates on the presidential race with the election just 63 days out, tackling Biden's backdown bump and RFK's failed attempt to remove himself from swing-state ballots before parsing deeper through swing-state polling as the Harris campaign comes into full swing. Next, Sam and Emma look to a devastating and dominant decision coming out of the Third Circuit, with TikTok's role in boosting an adolescent social media “Blackout” challenge resulting in the accidental death of a 10-year-old and producing the first successful challenge to Section 230 of the 1996 Communications Decency Act, which treated content on social media platforms as third-party speech, finally allowing for social media megacorporations to be held to account for their active curation of the speech that occurs on their sites.

They then look to Israel's shifting fronts in their ethnic cleansing of Palestine, first unpacking this weekend's news about the death of six hostages in Palestinian captivity, and the underrecognized role that both Biden and Bibi's uninhibited bloodlust played in making these murders possible, wrapping up the first half of the show by diving deep into Donald Trump's eagerness to take on Zionist money in his continued commitment to the erasure of Palestine, including a plan to fund the annexation of the West Bank, and exploring what David Friedman's take on the issue can illuminate for us all.
And in the Fun Half: Sam and Emma have an expansive conversation with John from San Antonio with the 2024 US Presidential Election some nine weeks away, tackling historical parallels, the politics of polling, and the swing-state situation with polls. Brett Cooper (aka Femme Shapiro) celebrates the seminal moment of a conspiracy-backed grifter endorsing another conspiracy-backed grifter, Adam from New Hampshire unpacks AOC's critique of the Green Party and helps the show dive deep into the idea of Harris separating herself from Biden's zionist bloodlust, plus, your calls and IMs! Donate IF YOU CAN to friend of the show Mohamed Aldaghma's Gaza Bakery project to help displaced families: https://www.gofundme.com/f/gaza-bakery-feeding-displaced-families Check out the LIMITED EDITION Vergogna shirt on the MR shop!: https://shop.majorityreportradio.com/collections/all-items/products/the-majority-report-vergogna-t-shirt Check out Tony Y, who designed the Vergogna shirt's website!: https://linktr.ee/tonyyanick AND! Check out Anne from Portland's website for HER Vergogna t-shirt! INQUIRE MORE HERE FOR DETAILS!: https://www.bonfire.com/store/pictrix-design/ Become a member at JoinTheMajorityReport.com: https://fans.fm/majority/join Follow us on TikTok here!: https://www.tiktok.com/@majorityreportfm Check us out on Twitch here!: https://www.twitch.tv/themajorityreport Find our Rumble stream here!: https://rumble.com/user/majorityreport Check out our alt YouTube channel here!: https://www.youtube.com/majorityreportlive Join Sam on the Nation Magazine Cruise! 7 days in December 2024!!: https://nationcruise.com/mr/ Check out StrikeAid here!; https://strikeaid.com/ Gift a Majority Report subscription here: https://fans.fm/majority/gift Subscribe to the ESVN YouTube channel here: https://www.youtube.com/esvnshow Subscribe to the AMQuickie newsletter here: https://am-quickie.ghost.io/ Join the Majority Report Discord! 
http://majoritydiscord.com/ Get all your MR merch at our store: https://shop.majorityreportradio.com/ Get the free Majority Report App!: http://majority.fm/app Check out today's sponsors: Express VPN: Protect your online privacy TODAY by visiting https://ExpressVPN.com/majority. That's https://ExpressVPN.com/majority and you can get an extra three months FREE.  Sunset Lake CBD: Visit https://SunsetLakeCBD.com before September 9th and use code LABOR to save 30% on their CBD tinctures and participating bundles. See their website for terms and conditions. Follow the Majority Report crew on Twitter: @SamSeder @EmmaVigeland @MattLech @BradKAlsop Check out Matt's show, Left Reckoning, on Youtube, and subscribe on Patreon! https://www.patreon.com/leftreckoning Check out Matt Binder's YouTube channel: https://www.youtube.com/mattbinder Subscribe to Brandon's show The Discourse on Patreon! https://www.patreon.com/ExpandTheDiscourse Check out Ava Raiza's music here! https://avaraiza.bandcamp.com/ The Majority Report with Sam Seder - https://majorityreportradio.com/

The Truth with Lisa Boothe
The Truth with Lisa Boothe: The Greatest Threats to National Security in the Digital Age with Kara Frederick

The Truth with Lisa Boothe

Play Episode Listen Later Sep 2, 2024 27:17 Transcription Available


In this episode, Lisa welcomes Kara Frederick, director of tech policy at the Heritage Foundation, to discuss pressing issues. They delve into the assassination attempt on Donald Trump, cyber attacks linked to the Chinese government, and Mark Zuckerberg's letter revealing government pressure on Facebook to censor content. Frederick provides expert insights on encrypted accounts, the nature of cyber threats, and the implications of Section 230 for big tech. The Truth with Lisa Boothe is part of the Clay Travis & Buck Sexton Podcast Network - new episodes debut every Monday & Thursday. See omnystudio.com/listener for privacy information.

Unchained
If the SEC Sues OpenSea, Here's Why the NFT Platform Could Win Easily - Ep. 696

Unchained

Play Episode Listen Later Aug 30, 2024 36:42


The SEC's latest enforcement action is targeting NFTs, and OpenSea is in the crosshairs. In this episode, crypto lawyer Preston Byrne joins to unpack the implications of the SEC's Wells Notice to OpenSea and what it might mean for the platform and the broader NFT market. Could Section 230 of the Communications Decency Act provide a unique defense for OpenSea? Preston also dives into other recent SEC moves, including cases against Stoner Cats, Impact Theory, and more.  Lastly, with the 2024 elections looming and political divides sharpening, is the SEC overreaching in its approach to crypto? Show highlights: Why Preston believes that the SEC will go after OpenSea for being an unregistered securities exchange What the Stoner Cats case was about and why it was not a strong enforcement action, according to Preston Why OpenSea's defense against the SEC may hinge on Section 230 protections for user-generated content, setting it apart from traditional exchanges like Coinbase or Binance How the clear-cut promises made by Impact Theory about potential returns made their NFTs resemble securities, unlike the typical art-focused NFTs on OpenSea Why Nate Chastain's NFT insider trading case is unlikely to impact the SEC's potential lawsuit against OpenSea Whether the $4 million settlement by Dapper Labs over NBA Top Shot NFTs likely represents little relevance to OpenSea's SEC issues What a Wells notice signals about the SEC's likelihood of suing OpenSea and why they might feel confident about winning this case How Jonathan Mann and Brian Frye's lawsuit for clarity on NFTs as securities highlights the SEC's potentially overreaching stance in its possible case against OpenSea How Trump's careful language around his NFT collection likely minimizes SEC risk by avoiding investment promises and focusing on their use as digital collectibles Whether the SEC's actions could reinforce the divide among crypto voters, with Trump promising a crypto-friendly stance and Harris likely 
continuing a more adversarial approach Visit our website for breaking news, analysis, op-eds, articles to learn about crypto, and much more: unchainedcrypto.com Thank you to our sponsors! iTrustCapital Polkadot Token 2049 Mantle's FBTC Gemini Guest Preston Byrne, Managing Partner at Byrne & Storm Links Wells notice Original announcement by X by OpenSea's CEO Devin Finzer Unchained: OpenSea's Wells Notice From the SEC Could Prove ‘Disastrous' Recent cases Unchained: Are NFTs Securities Offerings? Two Artists Sue the SEC to Find Out The Defiant: NFTs Are Securities? All Eyes Turn to Top Shot Case Reuters:  US regulator fines Stoner Cats creator for offering NFTs Ex-OpenSea manager sentenced to 3 months in prison for NFT insider trading Hester Peirce's dissent on the Stoner Cats case Others Paper: The Economic Reality of NFT Securities. Mondaq: Defining NFTs: Property, Securities, Or Commodities? National Post: Trump's newest NFTs show him as superhero, boxer and motorcyclist Timestamps: 00:00 - Introduction 02:11 - SEC targets OpenSea: Unregistered exchange? 03:58 - Stoner Cats case: Weak for SEC? 07:42 - OpenSea's defense: Section 230 protections 13:15 - Impact Theory's promises vs. OpenSea's NFTs 15:34 - Nate Chastain's case 17:15 - Dapper Labs settlement: Relevance to OpenSea 18:56 - Wells notice: SEC's confidence to sue 19:48 - Mann & Frye's lawsuit: SEC overreach? 22:39 - Trump's NFT strategy: Minimizing SEC risk 24:53 - What this Wells notice says about the presidential election 58:25 - News Recap Learn more about your ad choices. Visit megaphone.fm/adchoices

Late Confirmation by CoinDesk
UNCHAINED: If the SEC Sues OpenSea, Here's Why the NFT Platform Could Win Easily

Late Confirmation by CoinDesk

Play Episode Listen Later Aug 30, 2024 36:11


NFT marketplace OpenSea received a Wells notice from the SEC. Crypto lawyer Preston Byrne explains whether Gensler's agency has a chance to win a potential case.

The SEC's latest enforcement action is targeting NFTs, and OpenSea is in the crosshairs. In this episode, crypto lawyer Preston Byrne joins to unpack the implications of the SEC's Wells notice to OpenSea and what it might mean for the platform and the broader NFT market. Could Section 230 of the Communications Decency Act provide a unique defense for OpenSea? Preston also dives into other recent SEC moves, including cases against Stoner Cats, Impact Theory, and more. Lastly, with the 2024 elections looming and political divides sharpening, is the SEC overreaching in its approach to crypto?

Show highlights:
Why Preston believes that the SEC will go after OpenSea for being an unregistered securities exchange
What the Stoner Cats case was about and why it was not a strong enforcement action, according to Preston
Why OpenSea's defense against the SEC may hinge on Section 230 protections for user-generated content, setting it apart from traditional exchanges like Coinbase or Binance
How the clear-cut promises made by Impact Theory about potential returns made their NFTs resemble securities, unlike the typical art-focused NFTs on OpenSea
Why Nate Chastain's NFT insider trading case is unlikely to impact the SEC's potential lawsuit against OpenSea
Whether the $4 million settlement by Dapper Labs over NBA Top Shot NFTs likely represents little relevance to OpenSea's SEC issues
What a Wells notice signals about the SEC's likelihood of suing OpenSea and why they might feel confident about winning this case
How Jonathan Mann and Brian Frye's lawsuit for clarity on NFTs as securities highlights the SEC's potentially overreaching stance in its possible case against OpenSea
How Trump's careful language around his NFT collection likely minimizes SEC risk by avoiding investment promises and focusing on their use as digital collectibles
Whether the SEC's actions could reinforce the divide among crypto voters, with Trump promising a crypto-friendly stance and Harris likely continuing a more adversarial approach

Visit our website for breaking news, analysis, op-eds, articles to learn about crypto, and much more: unchainedcrypto.com

Thank you to our sponsors! iTrustCapital, Polkadot, Token 2049, Mantle's FBTC, Gemini

Guest: Preston Byrne, Managing Partner at Byrne & Storm

Unchained Podcast is Produced by Laura Shin Media, LLC. Distributed by CoinDesk. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Stanford Legal
High Court, High Stakes: The Massive Weight of Recent Supreme Court Rulings

Stanford Legal

Play Episode Listen Later Aug 29, 2024 39:51


The Supreme Court's latest term was marked by decisions of enormous consequence. However, the way the Court has communicated about these rulings far undersells the gravity they carry.

While “expressing itself in extremely modest terms,” Professor Jeffrey Fisher says, the current Supreme Court has “[handed] down decisions that have enormously consequential effects for our democracy, people's rights, and everything in between.” He and Assistant Professor Easha Anand, co-directors of the Supreme Court Litigation Clinic, agree that these recent decisions could reshape American law and politics for years to come.

In this episode of Stanford Legal with host Pam Karlan, Fisher and Anand take a critical look at recent Supreme Court rulings on abortion, gun rights, tech platforms, and the power of federal agencies, examining the Court's evolving approach and considering the potential long-term impacts on American democracy and the rule of law.

Connect:
Episode Transcripts >>> Stanford Legal Podcast Website
Stanford Legal Podcast >>> LinkedIn Page
Rich Ford >>> Twitter/X
Pam Karlan >>> Stanford Law School Page
Stanford Law School >>> Twitter/X
Stanford Law Magazine >>> Twitter/X

Links:
Jeff Fisher >>> Stanford Law School Page
Easha Anand >>> Stanford Law School Page
Stanford Supreme Court Litigation Clinic >>> Stanford Law School Page

(00:00:00) Chapter 1: Introduction to the Supreme Court Term and Key Cases
Pam Karlan is joined by Professors Jeff Fisher and Easha Anand to discuss the past term at the Supreme Court, constitutional law and Supreme Court practice, highlighting key cases and themes from the term. They explore how the court's conservative majority shapes the docket and the role of Justices Barrett and Jackson in developing their judicial voices.

(00:06:56) Chapter 2: High-Profile Cases: Guns, Abortion, and Administrative Law
Examine major cases, including gun rights in Rahimi v. United States and Cargill v. Garland, abortion-related cases, and the pivotal Loper Bright decision affecting the administrative state. They analyze the court's reasoning and the broader implications of these rulings.

(00:15:28) Chapter 3: The Court's Evolving Role and Methodology
Discussion of the broader implications of the Supreme Court's evolving approach to its docket and decision-making processes, particularly in relation to the administrative state and the impact of recent rulings on future cases.

(00:19:14) Chapter 4: The Supreme Court and Technology Cases
They delve into the significant technology cases that were brought before the Supreme Court this term. They discuss how the Court addressed state laws from Florida and Texas aimed at restricting content moderation by big tech companies, marking the first time the First Amendment was applied to social media platforms. The discussion highlights the tension between traditional legal frameworks and the evolving digital landscape, with a focus on the implications of these rulings for the future of free speech online.

(00:24:10) Chapter 5: Trump and the Supreme Court: Balancing Power and Immunity
The group explores the complex legal landscape surrounding former President Donald Trump's involvement in Supreme Court cases. Easha Anand provides an in-depth analysis of the Trump v. United States case, where the Court examined the extent of presidential immunity concerning acts related to the 2020 election. The discussion also touches on the broader implications of the Court's rulings on Trump's legal challenges, including how these decisions might shape future presidential conduct and accountability.

(00:29:27) Chapter 6: Supreme Court's Role in Protecting Democracy
Pam Karlan and Jeff Fisher discuss the Supreme Court's role in safeguarding democratic processes. They analyze the Court's reluctance to engage deeply in political matters, such as the January 6th prosecution and political gerrymandering, highlighting the tension between judicial restraint and the need to protect democratic values. The chapter concludes with reflections on the broader implications of these decisions for the future of U.S. democracy, particularly in the context of voting rights and election integrity.

Minimum Competence
Legal News for Weds 8/28 - Girardi Guilty, EU-US Split on AI Privacy, Trump Indictment Updated by Smith, TikTok Lawsuit Over Blackout Challenge

Minimum Competence

Play Episode Listen Later Aug 28, 2024 7:02


This Day in Legal History: Alabama Ten Commandments Monument

On August 28, 2003, the Supreme Court of Alabama took down a monument of the Ten Commandments from its courthouse rotunda, marking the culmination of a high-profile legal battle. The monument had been installed in 2001 by Chief Justice Roy Moore, who argued that it reflected the moral foundation of U.S. law. However, this act sparked a federal lawsuit, Glassroth v. Moore, in which three Alabama attorneys claimed the monument violated the Establishment Clause of the First Amendment, which prohibits government endorsement of religion.

The federal District Court for the Middle District of Alabama agreed with the plaintiffs, ordering Moore to remove the monument. Moore refused, maintaining that he had a duty to acknowledge God in his official capacity. The case was subsequently appealed to the Eleventh Circuit, which upheld the lower court's ruling. When Moore continued to defy the court orders, the Supreme Court of Alabama intervened, removing him from his position as Chief Justice. This case became a significant moment in the ongoing debate over the separation of church and state in the United States.

It is worth noting that Roy Moore, the then-Chief Justice of the Alabama Supreme Court who so vociferously argued for the inclusion of the Ten Commandments monument, is the selfsame Roy Moore who, during his 2017 U.S. Senate campaign, saw nine women accuse him of inappropriate conduct. Three of the women claimed they were assaulted by Moore when they were aged 14, 16, and 28. The other six women described Moore pursuing relationships with them when they were as young as 16. Independent witnesses corroborated that Moore had a reputation for approaching teenage girls at a local mall. Moore's responses to the allegations were inconsistent, initially recognizing some accusers but later denying knowledge of any of them.

Thomas V. Girardi, a prominent figure in toxic tort litigation, was convicted on four counts of wire fraud in Los Angeles federal court. Once renowned for his work on the Erin Brockovich case and his appearances on "Real Housewives of Beverly Hills," the disbarred attorney faced accusations of defrauding vulnerable clients. The jury reached a unanimous verdict after just four hours of deliberation, rejecting Girardi's defense that his cognitive decline prevented him from forming intent to commit fraud.

Prosecutors argued that Girardi knowingly deceived clients, fabricating excuses to explain the missing funds, which he had already spent. The trial centered on the suffering of clients who were betrayed by Girardi in their darkest moments, leading to their financial and emotional devastation. Girardi could face up to 80 years in prison at his sentencing in December. His former CFO, Christopher Kamon, will also stand trial for related charges. The case highlights Girardi's history of evading disciplinary action despite numerous complaints and reveals potential future charges against other senior lawyers at his firm.

Thomas Girardi Found Guilty by Jury of Defrauding Clients (2)

A recent decision by a German privacy regulator has sparked intense debate about how personal data is handled by AI models like large language models (LLMs). The Hamburg Commissioner for Data Protection concluded that LLMs, despite generating personal data, do not store such information in a way that makes it identifiable, challenging the notion that AI systems can retain personal data. This stance contradicts findings by technologists who argue that LLMs can memorize and reproduce specific data, including personal details. The German position could limit individuals' ability to control their data in AI systems, potentially leading to significant differences in how the U.S. and the EU regulate AI.
While California is pushing for laws that explicitly protect personal data in AI, the German approach may set a precedent for a more lenient interpretation under the GDPR. This divergence highlights the complexity of applying traditional privacy laws to AI technologies, with ongoing discussions about how to reconcile these differing perspectives.

By way of brief background, LLMs do not directly memorize the training material they are exposed to. Instead, they analyze vast amounts of text data and learn patterns, correlations, and structures within the language, which are then used to generate responses. This learning process involves creating a complex mathematical representation of language—a model—rather than storing specific pieces of text verbatim. However, because these models are trained on enormous datasets, they might sometimes generate outputs that resemble specific phrases or data points encountered during training, especially if those phrases are common or particularly distinctive. This can occasionally lead to the unintentional reproduction of personal or sensitive information from the training data, even though the model itself does not store or recall such information in a traditional, deliberate sense.

Of course, that would all be of slim comfort to someone who sees an AI chatbot spit out their home address and social security number in response to a prompt.

Personal Info in AI Models Threatens Split in US, EU Approach

Special Counsel Jack Smith is moving forward with prosecuting Donald Trump for allegedly attempting to overturn the 2020 election, despite a recent setback from the Supreme Court. The court found that Trump might have partial immunity from prosecution for actions taken as president, leading Smith to file a revised indictment.
This new version removes claims related to Trump's communications with government officials, including efforts to involve the Justice Department, but retains the core charges accusing Trump of conspiring to reverse his election loss. The case comes as Trump campaigns for the 2024 election, adding tension to the legal proceedings. Trump criticized the indictment on social media, calling for its dismissal. The updated indictment also cuts references to former Justice Department official Jeffrey Clark as a co-conspirator and modifies Trump's description, downplaying his role as president at the time. The case now focuses more on Trump's role as a candidate rather than his presidential actions. As the case progresses, Trump faces other legal challenges, including cases involving classified documents and charges in Georgia related to the 2020 election.

Trump Special Counsel Presses Ahead With 2020 Election Case (3)

A U.S. appeals court has revived a lawsuit against TikTok by the mother of a 10-year-old girl who died after attempting a dangerous "blackout challenge" promoted on the platform. The Philadelphia-based 3rd U.S. Circuit Court of Appeals ruled that TikTok is not shielded by Section 230 of the Communications Decency Act, which typically protects internet companies from liability for user-generated content. The court found that Section 230 does not apply when TikTok's algorithm actively recommends harmful content, viewing such recommendations as the company's own speech. This decision marks a shift from previous interpretations of Section 230, which had generally protected platforms from liability for failing to prevent the spread of harmful content. The ruling overturns a lower court's dismissal of the case, allowing the mother, Tawainna Anderson, to pursue claims against TikTok and its parent company, ByteDance, following her daughter Nylah's death in 2021.
The case could have significant implications for how tech companies are held accountable for the content their algorithms promote.TikTok must face lawsuit over 10-year-old girl's death, US court rules | Reuters This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit www.minimumcomp.com/subscribe
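The memorization risk described in the episode summary above can be illustrated with a toy sketch. This is not how production LLMs are built; it uses a two-word Markov chain (a crude stand-in for learned next-token statistics), and all names and addresses in it are invented. The point it demonstrates: a model that only stores statistics, not documents, can still replay a distinctive one-off training string verbatim, because a rare prefix leaves only one statistical path through it.

```python
import random
from collections import defaultdict

# Toy word-level Markov model: a loose stand-in for how a language model
# learns next-token statistics from training text. Hypothetical data only.
def train(corpus_tokens, order=2):
    model = defaultdict(list)
    for i in range(len(corpus_tokens) - order):
        context = tuple(corpus_tokens[i:i + order])
        model[context].append(corpus_tokens[i + order])
    return model

def generate(model, seed, n_tokens, rng):
    out = list(seed)
    for _ in range(n_tokens):
        choices = model.get(tuple(out[-2:]))  # order-2 context
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Common phrasing appears many times; one distinctive (invented) string
# appears exactly once in the training data.
corpus = (
    "the court ruled on the case . " * 50
    + "jane doe lives at 12 example lane . "
).split()

model = train(corpus)

# Prompting with the rare prefix replays the unique continuation verbatim:
# the model never "stored" the sentence, but its statistics admit only one
# way to continue it.
print(generate(model, ("jane", "doe"), 5, random.Random(0)))
# → jane doe lives at 12 example lane
```

For common contexts (e.g. "the court"), several continuations compete and the output varies; for the one-off "personal" sentence, regurgitation is deterministic — a miniature version of the dynamic the episode describes.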

Daily Signal News
Sen. Marshall: Google Is 'Using Algorithms to Campaign Against' Trump

Daily Signal News

Play Episode Listen Later Aug 2, 2024 22:49


Sen. Roger Marshall says he will launch an investigation into Google after the search engine suppressed content related to the assassination attempt on former President Donald Trump.  Google is “no longer functioning as a search engine that just assimilates” information, says Marshall, R-Kan., “but now they're using algorithms to campaign against President Trump.”  Initiating a Google search of “assassination attempt on president” quickly reveals that Trump's name is not within the autofill of suggested searches, although former Presidents Harry Truman, Gerald Ford, and Ronald Reagan are.  Google enjoys legal protections under Section 230 of the Communications Decency Act. Those protections shield Google and other platforms, such as Facebook and X, from civil liability for the content users of the platform generate. But if Google wants to act as a publisher, like a news outlet, it should not enjoy Section 230 protections, Marshall says.  The Kansas Republican is calling on Google to explain what he regards as content suppression, but says he thinks that “if we had a strong commander in chief, that they would be intervening already.” Marshall joins “The Daily Signal Podcast” to discuss the investigation into what he asserts is Google's content suppression.  Marshall also weighs in on the plea deal reached with three terrorists behind the 9/11 attacks, including Khalid Sheikh Mohammed, described as the mastermind of the attacks. The deal takes the death penalty off the table for the terrorists imprisoned at Guantanamo Bay, Cuba, in exchange for them pleading guilty to a number of charges, including the murders of nearly 3,000 people on Sept. 11, 2001. The Kansas lawmaker explains why he thinks the deal is a “slap in the face” to the men and women who lost their lives on 9/11, their families, and all who still suffer from physical injuries because of the terrorist attacks.  Enjoy the show!

So to Speak: The Free Speech Podcast
Ep. 221: Section 230 co-author, Rep. Christopher Cox

So to Speak: The Free Speech Podcast

Play Episode Listen Later Aug 1, 2024 58:17


Some argue that Section 230 allows the internet to flourish. Others argue it allows harmful content to flourish. Christopher Cox knows something about Section 230: He co-wrote it.  Section 230 of the Communications Decency Act is an American law passed in 1996 that shields websites from liability for content posted on their sites by users.  What does Rep. Cox make of the law today? Rep. Cox was a 17-year member of the House of Representatives and is a former chairman of the Securities and Exchange Commission.   Timestamps 0:00 Intro 2:43 Did Section 230 create the modern internet? 7:48 America's technological advancement 11:33 Section 230's support for good faith content moderation 18:00 User privacy and age verification? 25:37 Rep. Cox's early experiences with the internet 30:24 Did we need Section 230 in the first place? 37:51 Are there any changes Rep. Cox would make to Section 230 now? 42:40 How does AI impact content creation and moderation? 47:23 The future of Section 230 54:31 Closing thoughts 57:30 Outro   Show notes: Section 230 text “The Twenty-Six Words that Created the Internet” by Jeff Kosseff Cubby, Inc. v. CompuServe Inc. (S.D.N.Y. 1991) Stratton Oakmont, Inc. v. Prodigy Services Co. (N.Y. Sup. Ct. 1995) “Section 230: A Retrospective” by Chris Cox Section 230: Legislative History (Electronic Frontier Foundation)  

ITSPmagazine | Technology. Cybersecurity. Society
The Misinformation Crisis: Navigating Technology and Truth in Modern Society | A Conversation with Joy Scott and Andrew Edwards | Redefining Society with Marco Ciappelli

ITSPmagazine | Technology. Cybersecurity. Society

Play Episode Listen Later Jul 20, 2024 41:28


Guests:

✨ Joy Scott, President, Scott Public Relations [@Scott_PR]
On LinkedIn | https://www.linkedin.com/in/scottpublicrelations/

Andrew Edwards, Founder and CEO, Verity7
On LinkedIn | https://www.linkedin.com/in/andrewvedwards/
On Twitter | https://x.com/AndrewVEdwards
On Instagram | https://www.instagram.com/andrewvedwards1/

Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast
On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli

This Episode's Sponsors

BlackCloak

So to Speak: The Free Speech Podcast
Ep. 216: Section 230 and online content moderation

So to Speak: The Free Speech Podcast

Play Episode Listen Later Jun 6, 2024 81:26


Did 26 words from an American law passed in 1996 create the internet? Section 230 of the Communications Decency Act says that interactive websites and applications cannot be held legally liable for the content posted on their sites by their users. Without the law, it's likely Facebook, Amazon, Reddit, Yelp, and X wouldn't exist — at least not in their current form. But some say the law shields large tech companies from liability for enabling, or even amplifying, harmful content. On today's show, we discuss Section 230, recent efforts to reform it, and new proposals for content moderation on the internet. Marshall Van Alstyne is a professor of information systems at Boston University. Robert Corn-Revere is FIRE's chief counsel. Timestamps 0:00 Intro 3:52 The origins of Section 230? 6:40 Section 230's “forgotten provision” 13:29 User vs. platform control over moderation 23:24 Harms allegedly enabled by Section 230 40:17 Solutions 46:03 Private market for moderation 1:02:42 Case study: Hunter Biden laptop story 1:09:19 “Duty of care” standard 1:17:49 The future of Section 230 1:20:35 Outro Show Notes - Hearing on a Legislative Proposal to Sunset Section 230 of the Communications Decency Act (May 22, 2024) - “Platform Revolution” by Marshall Van Alstyne - “The Mind of the Censor and the Eye of the Beholder” by Robert Corn-Revere - “Protocols, Not Platforms: A Technological Approach to Free Speech” by Mike Masnick - “Sunset of Section 230 Would Force Big Tech's Hand” by Cathy McMorris Rodgers and Frank Pallone Jr. - “Buy This Legislation or We'll Kill the Internet” by Christopher Cox and Ron Wyden - “Free Speech, Platforms & The Fake News Problem” (2021) by Marshall Van Alstyne - “Free Speech and the Fake News Problem” (2023) by Marshall Van Alstyne - “It's Time to Update Section 230” by Michael D. Smith and Marshall Van Alstyne - “Now It's Harvard Business Review Getting Section 230 Very, Very Wrong” by Mike Masnick

The Health Ranger Report
Brighteon Broadcast News, May 31, 2024 – TRUMP CONVICTED in sham trial; Vindictive Dems just made him the MOST POPULAR candidate in history

The Health Ranger Report

Play Episode Listen Later May 31, 2024 135:12


- Trump's trial and acquittal, with reactions from Trump and Attorney General Paxton of Texas. (0:03) - Political corruption and violence. (11:12) - US allowing Ukraine to use missiles to strike Russia. (18:56) - Potential US-Russia nuclear war due to Ukraine conflict. (23:19) - Criminal justice reform in Boston, decriminalizing crimes. (29:00) - Vaccine safety and freedom in Japan, with international appeal. (41:05) - US history of nuclear warfare and potential for future violence. (45:41) - Impending collapse of America, with emphasis on Trump and self-defense gear. (50:15) - Bird flu pandemic, PCR testing fraud, and censorship. (57:39) - The dangers of centralized power, surveillance, and the potential for nuclear war. (1:06:08) - Russia's military capabilities and potential false flag nuclear attack in US. (1:11:17) - The decline of the US Empire and the potential for a new era of freedom. (1:15:56) - Lawsuit against globalist elites for censorship and control. (1:19:29) - US govt funding censorship through foreign NGOs. (1:23:52) - Government censorship and its impact on free speech. (1:28:55) - Censorship lawsuit against government and big tech. (1:33:24) - Section 230 of the Communications Decency Act and its implications for social media platforms. (1:43:26) - Lawsuit against government and tech companies for censorship. (1:49:14) - Population control, neurological damage, and monetary systems. (2:06:17) - Climate change, energy, and censorship. (2:11:02) For more updates, visit: http://www.brighteon.com/channel/hrreport NaturalNews videos would not be possible without you, as always we remain passionately dedicated to our mission of educating people all over the world on the subject of natural healing remedies and personal liberty (food freedom, medical freedom, the freedom of speech, etc.). 
Together, we're helping create a better world, with more honest food labeling, reduced chemical contamination, the avoidance of toxic heavy metals and vastly increased scientific transparency. ▶️ Every dollar you spend at the Health Ranger Store goes toward helping us achieve important science and content goals for humanity: https://www.healthrangerstore.com/ ▶️ Sign Up For Our Newsletter: https://www.naturalnews.com/Readerregistration.html ▶️ Brighteon: https://www.brighteon.com/channels/hrreport ▶️ Join Our Social Network: https://brighteon.social/@HealthRanger ▶️ Check In Stock Products at: https://PrepWithMike.com

Potential to Powerhouse: Success Secrets for Women Entrepreneurs
77 - Social Media on Trial: Annie McAdams' Battle Against Child Trafficking in the Digital Age – Part 2

Potential to Powerhouse: Success Secrets for Women Entrepreneurs

Play Episode Listen Later May 23, 2024 35:11


On today's episode of the Potential to Powerhouse podcast, we caught up with powerhouse Annie McAdams to hear more about the important work she is doing around social media, child exploitation, and sex trafficking. Annie and her firm are taking on big tech, going head to head to protect prospective victims against predators who are openly grooming and luring kids for sex online. In this eye-opening second episode with lawyer Annie McAdams, we continue uncovering how big tech companies play a role in child sex trafficking, and how she is holding them accountable.  If you found Part 1 insightful, brace yourself for an even deeper dive in Part 2 as we continue our conversation with the fierce trial lawyer Annie McAdams. After her appearance last year on 60 Minutes, Annie returns to discuss with Tracy the updates on the ongoing battle against major tech companies and their role in facilitating child sex trafficking. Part 2 Key Highlights... Annie's Ongoing Legal Crusade: Updates on Annie's strategies and recent court victories since her notable "60 Minutes" appearance on Potential to Powerhouse. Annie McAdams' Fight Against Tech Companies: Learn how Annie and her team are tackling the misuse of social media platforms in facilitating the sex trafficking of minors, and their efforts to overturn outdated protections like the Communications Decency Act. The Role of Social Media in Trafficking: Insights into how traffickers exploit digital platforms to target minors and the urgent need for stricter platform regulation. Judicial Reforms and Tech Accountability: Exploring the latest legal reforms that aim to hold tech companies responsible for their platforms' misuse. Parental Vigilance: Annie stresses the importance of parental awareness and the false sense of security that tech companies provide about child safety. This episode is a crucial listen for parents, teachers, and communities.
It sheds light on the dark side of social media, providing necessary knowledge to help recognize and combat these threats. I don't know about you, but I have tried every approach to block and monitor unknown or unwanted outreach through different games and devices to my kids. It seems like it is always a battle to really protect our kids from people we don't know, but who front as kids to our kids online.  This is a legit problem, and Annie is committed to exposing the issue to benefit parents and stop the big money game of disappearing messages and clearly ‘kid targeting' social apps that make it almost impossible for parents to monitor and spot their young kids getting targeted. Recap of Key Points from Part 1:  Episode 20 - Taking On Human Trafficking and the Dark Side of Social Media in Big Tech with Attorney Annie McAdams Annie McAdams is at the forefront of a legal battle against the likes of Snapchat, Craigslist, Facebook, and Instagram, challenging their roles in facilitating child exploitation. A recent landmark court decision may pave the way for greater accountability of tech companies in cases of online trafficking. Catch her essential tips for parents on monitoring their children's digital interactions to prevent exploitation. Share this episode with friends, family, educators, and community leaders. Educating ourselves and others is vital for giving us the tools we need to be smarter about how our kids are navigating their way online. Engage with Annie McAdams: Website: anniemcadamspc.com Connect with Us: Website: potentialtopowerhouse.com Email: info@potentialtopowerhouse.com Instagram: @potentialtopowerhouse Facebook: Potential to Powerhouse Tune into this essential episode to gain insights into safeguarding your children in the digital era. Send us a note and let us know if you heard something important you want to share with our community.

Marketplace Tech
A professor tries to turn the tables on Section 230's web protections

Marketplace Tech

Play Episode Listen Later May 22, 2024 13:39


The internet today is largely governed by 26 words in the Communications Decency Act, signed on Feb. 8, 1996, by then-President Bill Clinton. “Today, with the stroke of a pen, our laws will catch up with our future,” he proclaimed during the signing of the act. The web has changed a bit since then. But Section 230 of that law has not. Today, social media companies routinely use Section 230 to protect themselves from liability over what users post. Now, an internet scholar wants to change that. Will Oremus wrote about him for The Washington Post.

Marketplace All-in-One
A professor tries to turn the tables on Section 230's web protections

Marketplace All-in-One

Play Episode Listen Later May 22, 2024 13:39


The internet today is largely governed by 26 words in the Communications Decency Act, signed on Feb. 8, 1996, by then-President Bill Clinton. “Today, with the stroke of a pen, our laws will catch up with our future,” he proclaimed during the signing of the act. The web has changed a bit since then. But Section 230 of that law has not. Today, social media companies routinely use Section 230 to protect themselves from liability over what users post. Now, an internet scholar wants to change that. Will Oremus wrote about him for The Washington Post.

Minimum Competence
Legal News for Weds 5/22 - Rudy Promises to Stop Defaming, Biden Gets 200th Judge Confirmed, ABA Faces Discrimination Complaints and Section 230 Hearing

Minimum Competence

Play Episode Listen Later May 22, 2024 6:49


This Day in Legal History: Grant Signs the General Amnesty Act

On May 22, 1872, President Ulysses S. Grant signed the General Amnesty Act, marking a significant moment in the post-Civil War reconstruction era. This legislation restored voting rights to most former Confederate rebels who had been disenfranchised under the Fourteenth Amendment as a punishment for their participation in the rebellion. The Act effectively re-integrated approximately 150,000 Southern men back into the political process, leaving only about 500 individuals still excluded from voting and holding office due to their high-ranking roles in the Confederacy.

This move was seen as a step towards national reconciliation, aiming to heal the divisions caused by the Civil War. The General Amnesty Act reflected a shift in federal policy from punitive measures towards a more inclusive approach to rebuilding the nation. It acknowledged the need to bring Southern states fully back into the Union by restoring their citizens' civil rights.

The process of granting amnesty to former Confederates culminated in full universal amnesty on June 6, 1898. By this time, all remaining restrictions were lifted, allowing every former Confederate the right to vote and hold office. This complete restoration of rights underscored the nation's commitment to moving past its divided history and fostering unity among its citizens. The General Amnesty Act of 1872 was a crucial step in this lengthy process of reconciliation and reintegration.

Rudolph Giuliani reached an agreement in bankruptcy court preventing him from making further defamatory statements about Georgia election workers Ruby Freeman and Wandrea' Arshaye “Shaye” Moss. This accord, set to be approved by Judge Sean H. Lane, follows accusations from Freeman and Moss that Giuliani defamed them during an April livestream. They are also pursuing a $148 million defamation verdict awarded to them in December for Giuliani's false claims of voter fraud. Freeman and Moss had filed a lawsuit on May 10, alleging Giuliani continued his defamatory actions. Giuliani's radio show was canceled earlier this month amid these allegations.

Giuliani Signs Bankruptcy Court Deal Barring Further Defamation

President Joe Biden is poised to secure his 200th judicial appointment with the U.S. Senate set to confirm U.S. Magistrate Judge Angela Martinez as a district court judge in Arizona. This achievement surpasses the pace set by his predecessor, Donald Trump, despite initial challenges due to a slim Democratic majority in the Senate. Biden's success in confirming judicial nominees, facilitated by deals with Republican senators, contrasts with Trump's more conservative appointments, which shifted the federal judiciary rightward, including the Supreme Court. Biden has focused on diversity, with two-thirds of his appointees being women and a significant proportion being racial minorities. Despite potential hurdles, the White House aims to continue pushing nominations to avoid more extreme outcomes in future judicial appointments.

Biden to secure 200th judicial confirmation as election looms | Reuters

The Wisconsin Institute for Law and Liberty (WILL) has filed complaints against the American Bar Association (ABA), a federal judge, and three law schools, alleging discrimination in student hiring programs. WILL claims these programs violate federal law by using racial quotas and preferences, which they argue have long been illegal. The complaint, filed with the Justice and Education Departments, targets programs that allegedly favor applicants based on race, age, and sexual orientation. South Texas College, the University of the Pacific, and Willamette University are also named in the complaint. Additionally, WILL filed a complaint against Federal Magistrate Judge Leo Brisbois, accusing him of discriminatory practices in ABA's internship and clerkship programs. These actions follow the Supreme Court's 2023 decision to end affirmative action in college admissions. Other conservative groups have similarly challenged diversity programs at major law firms and universities, claiming discrimination against white men.

ABA Faces Discrimination Complaint Over Student Hiring Programs

The House Energy and Commerce Committee will be holding a legislative hearing today, titled “Legislative Proposal to Sunset Section 230 of the Communications Decency Act.” This hearing aims to discuss draft legislation that would terminate Section 230 and push for new regulations.

For those unaware, we have covered Section 230 in a Max Min episode, available via a link in the shownotes. By way of brief background here, or reminder for long time listeners, Section 230 of the Communications Decency Act, enacted in 1996, is a crucial piece of U.S. legislation that provides immunity to online platforms from being held liable for content posted by their users. This law enables websites, including social media networks and forums, to host user-generated content without the risk of facing lawsuits for defamation, libel, or other legal issues arising from that content. Additionally, Section 230 allows platforms to moderate content in good faith, giving them the flexibility to remove or restrict access to content they consider objectionable without being treated as the publisher of that content. This framework has been key in fostering the growth and diversity of the internet as we know it today–for better or worse.

While reforming Section 230 has been a contentious topic, sunsetting the law at this juncture is a misguided approach. Much of the internet's infrastructure relies on the protections offered by Section 230, which shields platforms from liability for user-generated content. This admittedly jerry-rigged but essential policy enables the free flow of information and supports innovation by allowing platforms to host diverse viewpoints without fear of constant litigation. Over the last 26 years it has unquestionably done more to shield marginalized communities from the most virulent hate speech than it has been used as cover for bad actors and, while it also unquestionably needs tweaking, it needn't be discarded entirely. Removing the protections of Section 230 could immediately and irrevocably stifle innovation and severely impact small platforms that cannot afford extensive moderation. Although Section 230 is not perfect, completely eliminating it without a robust and well-considered replacement could lead to more harm than good. It is crucial that any legislative changes balance the need for accountability with the preservation of the open internet.

Bipartisan Energy and Commerce Leaders Announce Legislative Hearing on Sunsetting Section 230

Get full access to Minimum Competence - Daily Legal News Podcast at www.minimumcomp.com/subscribe

Bob Sirott
Could we see an update to social media regulations?

Bob Sirott

Play Episode Listen Later Mar 5, 2024


Social media strategist Scott Kleinberg joins Bob Sirott to explain the background of social media laws in Florida and Texas, as well as the need for clearer social media regulations and who these laws impact. He also discusses what Section 230 of the Communications Decency Act is and answers this week's genius bar question.

Rich Zeoli
Elizabeth Warren & Lindsey Graham Look to End Section 230, Censor Speech Online

Rich Zeoli

Play Episode Listen Later Feb 2, 2024 181:56


The Rich Zeoli Show- Full Episode (02/01/2024): 3:05pm- On Wednesday, Meta CEO Mark Zuckerberg testified before the Senate Judiciary Committee. During Sen. Lindsey Graham's (R-SC) opening statement, he accused Zuckerberg of having blood on his hands—emphatically stating “you have a product that is killing people” as those in attendance applauded. Zuckerberg was also notably grilled by Sen. Ted Cruz (R-TX) and Sen. Josh Hawley (R-MO). Could strict regulations on social media be coming via new legislation? Though there is clearly bipartisan support for legislation to upend Section 230, government should not be in the business of regulating speech online—it ultimately leads to the censorship of American citizens expressing opinions contrary to mainstream narratives. 3:10pm- Meta CEO Mark Zuckerberg publicly apologized to families negatively impacted by social media after being implored by Sen. Josh Hawley (R-MO) to take accountability. Hawley, who notably wrote the “Tyranny of Big Tech” in 2021, has asserted that social media is one of the biggest threats to America. 3:15pm- A newly released Quinnipiac poll shows President Joe Biden with a 6-point national lead over Republican presidential candidate Donald Trump. Though, as Rich notes, the polling results may not be as problematic for Trump as the media is making it seem. First, the poll relied on “registered” voters, not “likely” voters. Plus, it's a national poll and doesn't consider state polling. According to a new Bloomberg News/Morning poll, Trump leads Biden in a hypothetical 2020 rematch by 3 to 10 points in Wisconsin, Pennsylvania, North Carolina, Nevada, Michigan, Georgia, and Arizona. Trump's projected lead expands even further when considering the third-party candidacies of Robert Kennedy Jr., Jill Stein, and Cornel West.
3:30pm- Heather Knight of The New York Times writes: “Fifteen months after city officials were ready to throw a party in the Noe Valley Town Square to celebrate funding for a tiny bathroom with a toilet and sink, nothing but mulch remains in its place. The toilet project broke down the minute taxpayers realized the city was planning an event to celebrate $1.7 million in state funds that local politicians had secured for the lone 150-square-foot structure. That's enough to purchase a single-family home in San Francisco—with multiple bathrooms. Even more confounding was the explanation that the tiny bathroom would take two to three years to install because of the city's labyrinthine permitting and building process. City leaders quickly canceled their potty party, and Gov. Gavin Newsom of California took back the funds.” You can read the full story about San Francisco's $1.7 million toilet here: https://www.nytimes.com/2024/01/24/us/san-francisco-toilet.html 3:45pm- Bud-Light has announced a partnership with comedian Shane Gillis. Will the Gillis affiliation be enough for Bud-Light to bounce back from the criticism it received after working with transgender social media influencer Dylan Mulvaney? 4:05pm- Robby Soave of Reason writes: “There is no pastime more beloved by Congress than beating up on social media executives. On Wednesday, members of the Senate Judiciary Committee engaged in yet another round of fact-free histrionics as they thunderously denounced four tech CEOs—Meta's Mark Zuckerberg, X's Linda Yaccarino, Snapchat's Evan Spiegel, and Discord's Jason Citron—for a litany of allegedly unsafe business practices… Many of the Senate's anti-tech crusaders were present, including Republican Sens. Lindsey Graham (SC), Ted Cruz (TX), and Josh Hawley (MO), and Democratic Sens. Dick Durbin (IL), Amy Klobuchar (MN), and Richard Blumenthal (CT). Sen. Elizabeth Warren (MA) wasn't there, though she received several favorable shout-outs from the Republicans.
Indeed, both sides of the political aisle were exceedingly pleased with themselves for acting in bipartisan fashion to wildly accuse four business leaders of complicity in despicable crimes against children… In order to obtain this control, senators from both parties have sponsored legislation to repeal or reform Section 230, the federal statute that protects internet companies from some liability. Section 230 was a frequent punching bag at the Wednesday hearing.” You can read the full article here: https://reason.com/2024/02/01/mark-zuckerberg-senate-hearing-graham-facebook/ 4:15pm- While appearing on CNN, Senator Dick Durbin (D-IL) said: “I'm not going to protect Section 230 at the expense of children…if 230 has to go, it has to go.” Section 230 of the Communications Decency Act notably shields social media companies from liability for speech posted to their platforms. 4:35pm- David Propper of The New York Post writes: “A Montana family claims they lost custody of their 14-year-old child after opposing her interest in changing genders — and while the governor's office defended the move, it stressed to The Post that the state does not remove minors to provide gender transition services. The state's Child and Family Services (CFS) reportedly took custody of the teen from her father, Todd Kolstad, and stepmother, Krista, this month, leading the parents to speak out about how the action has ‘destroyed' their family and ‘trampled' their rights.” You can read the full report here: https://nypost.com/2024/01/30/news/montana-parents-lose-custody-of-daughter-after-opposing-transition-report/ 4:50pm- Rich opens the phone lines and gets flooded with off-topic calls—Matt is reprimanded. 5:05pm- On Wednesday, Meta CEO Mark Zuckerberg testified before the Senate Judiciary Committee. During Sen. 
Lindsey Graham's (R-SC) opening statement, he accused Zuckerberg of having blood on his hands—emphatically stating “you have a product that is killing people” as those in attendance applauded. Zuckerberg was also notably grilled by Sen. Ted Cruz (R-TX) and Sen. Josh Hawley (R-MO). Could strict regulations on social media be coming via new legislation? Though there is clearly bipartisan support for legislation to upend Section 230, government should not be in the business of regulating speech online—it ultimately leads to the censorship of American citizens expressing opinions contrary to mainstream narratives. 5:15pm- While speaking from the House floor, Congresswoman Pramila Jayapal (D-WA) voiced her disagreement with proposed legislation that would allow for the deportation of undocumented migrants convicted of driving while intoxicated (DWI). She also implored society to stop referring to migrants who enter the U.S. unlawfully as “illegals.” 5:20pm- In audio leaked from The San Francisco Standard, Governor Gavin Newsom (D-CA) can be heard retelling a story about how he witnessed a theft while at Target, and how he reprimanded a worker for suggesting California isn't tough on crime. 5:30pm- Teri Hatcher, who was a “Bond-girl” in the film Tomorrow Never Dies, says she is officially done with online dating and is content with being single. 5:45pm- In his latest piece for The New York Post, George Washington University Law Professor Jonathan Turley writes: “Hunter Biden is comparable to children in Japanese internment camps, to undocumented immigrants, to the murdered descendants of the Tsar. At least that's what he argues in a new court filing in his federal gun case, which presents Hunter as one of the most tragic figures since the fall of Troy. Literally. 
In a brief that borders on delusional, Biden's lawyers say the son of the president who burned through millions from influence peddling is comparable to all those unfortunate and destitute souls.” You can read the full article here: https://nypost.com/2024/01/31/opinion/hunter-biden-compares-self-to-a-romanov-a-migrant-child-and-a-greek-tragedy-in-delusional-court-filing/ 6:05pm- Rich is busy hosting a stand-up comedy show at Club 360 at Parx Casino tonight—so, Mike Opelka hosts hour 4 of the show! 6:10pm- Tomorrow is Groundhog Day! Will you be in Gobbler's Knob partying? Interestingly, PETA is calling for replacing the groundhog with a simple coin toss to determine whether there will be six more weeks of winter. According to calculations, since 1887, Punxsutawney Phil has accurately predicted the winter weather just 39% of the time. 6:35pm- Tommy Christopher of Mediaite writes: “Fulton County District Attorney Fani Willis is standing firm in the face of efforts to get her to step down from the blockbuster election crimes case against former President Donald Trump and 18 co-defendants. Trump and others have demanded DA Willis be removed from the 34-count felony election crimes case on the basis of allegations made in a court filing by one of Trump's co-defendants. Willis is accused of an “improper” relationship with special prosecutor Nathan Wade, whom she hired as part of the Trump election crimes team. But according to a CNN exclusive by Zachary Cohen, Willis has decided she's not going anywhere.” You can read more here: https://www.mediaite.com/news/cnn-sources-say-trump-election-crimes-prosecutor-fani-willis-will-not-step-down-from-blockbuster-rico-case/ 6:40pm- On Thursday, U.S. Capitol Police announced they will not charge a former Senate staffer who allegedly had sex in a Congressional hearing room last year. The incident reportedly occurred where Sen. Amy Klobuchar (D-MN) sits during Senate Judiciary hearings. 

Rich Zeoli
Should Zuckerberg Be Held Responsible for Everything on Facebook & Instagram?

Rich Zeoli

Play Episode Listen Later Feb 1, 2024 45:47


The Rich Zeoli Show- Hour 2: Robby Soave of Reason writes: “There is no pastime more beloved by Congress than beating up on social media executives. On Wednesday, members of the Senate Judiciary Committee engaged in yet another round of fact-free histrionics as they thunderously denounced four tech CEOs—Meta's Mark Zuckerberg, X's Linda Yaccarino, Snapchat's Evan Spiegel, and Discord's Jason Citron—for a litany of allegedly unsafe business practices… Many of the Senate's anti-tech crusaders were present, including Republican Sens. Lindsey Graham (SC), Ted Cruz (TX), and Josh Hawley (MO), and Democratic Sens. Dick Durbin (IL), Amy Klobuchar (MN), and Richard Blumenthal (CT). Sen. Elizabeth Warren (MA) wasn't there, though she received several favorable shout-outs from the Republicans. Indeed, both sides of the political aisle were exceedingly pleased with themselves for acting in bipartisan fashion to wildly accuse four business leaders of complicity in despicable crimes against children… In order to obtain this control, senators from both parties have sponsored legislation to repeal or reform Section 230, the federal statute that protects internet companies from some liability. Section 230 was a frequent punching bag at the Wednesday hearing.” You can read the full article here: https://reason.com/2024/02/01/mark-zuckerberg-senate-hearing-graham-facebook/ While appearing on CNN, Senator Dick Durbin (D-IL) said: “I'm not going to protect Section 230 at the expense of children…if 230 has to go, it has to go.” Section 230 of the Communications Decency Act notably shields social media companies from liability for speech posted to their platforms. David Propper of The New York Post writes: “A Montana family claims they lost custody of their 14-year-old child after opposing her interest in changing genders — and while the governor's office defended the move, it stressed to The Post that the state does not remove minors to provide gender transition services.
The state's Child and Family Services (CFS) reportedly took custody of the teen from her father, Todd Kolstad, and stepmother, Krista, this month, leading the parents to speak out about how the action has ‘destroyed' their family and ‘trampled' their rights.” You can read the full report here: https://nypost.com/2024/01/30/news/montana-parents-lose-custody-of-daughter-after-opposing-transition-report/ Rich opens the phone lines and gets flooded with off-topic calls—Matt is reprimanded.

TonioTimeDaily
Sexually living my life above reproach and romantically living my life above reproach (revenge pornography)

TonioTimeDaily

Play Episode Listen Later Jan 29, 2024 96:04


“Revenge porn is a type of digital abuse in which nude or sexually explicit photos or videos are shared without the consent of those pictured. Also called nonconsensual pornography, it's closely related to sexual abuse. A current or previous partner may share such images as “revenge” or threaten to distribute them as a type of blackmail. You may have sent these private images to a partner. A partner may have convinced you to take explicit pictures, possibly in an effort to control or shame you. An abusive partner could even take sexual or nude photos of you without your knowledge. Revenge porn isn't limited to romantic partners, though. A co-worker, family member, or stranger could also gain access to your private images and share them publicly for a variety of reasons. But it's important to understand that someone else's harmful actions aren't a reflection of your self-worth. Once you've processed your trauma, take steps to reconnect with your community and take part in social activities that you enjoy. Harassment and harm to reputation. Revenge porn posts may include your name, links to your social media accounts, and even your phone number. In one study, almost half of people affected by revenge porn said others had harassed or stalked them online. Some victims have reported that revenge porn caused them to lose their jobs or damaged their family relationships. You might decide not to apply for a job for fear a prospective employer would come across your images in an online search. Is Revenge Porn Illegal? Forty-six states and the District of Columbia have laws against revenge porn. Only Wyoming, Mississippi, South Carolina, and Massachusetts lack this kind of law. There's no federal law against revenge porn. But in all states, it's illegal to share sexual videos or pictures of anyone under age 18. The Communications Decency Act of 1996 regulates porn on the internet. 
It says websites and internet providers don't have legal responsibility for pictures or videos posted by their users. That means they're not legally required to take down revenge porn unless it breaks copyright or federal criminal laws. Some may do so voluntarily if the content violates their user guidelines. How Can Revenge Pornography Affect You? Revenge porn can harm you in several ways: Psychological issues. You may deal with long-term personal and psychological issues after private images are posted publicly. One study found that up to 93% of those involved in revenge porn had major emotional distress, such as guilt, depression, paranoia, anger, or suicidal thoughts. If you're having these feelings, seek help from a mental health professional. Some specialize in sexual trauma. Call the suicide and crisis hotline at 988 or visit 988lifeline.org if you're thinking of hurting yourself. Social anxiety and isolation. If you're a victim of revenge porn, you might start to withdraw from social settings and become isolated. It can make you feel worthless or ashamed. Here are some steps you can take: Document everything. You'll probably have the urge to immediately take down any personal images you find on the internet. But if you want to take legal action, you need to document this information. Before you delete anything, collect data. Take screenshots, download images, and turn website pages into PDFs. Get an order of protection, which creates a legal action to stop abuse. It can legally block an abusive person from communicating with you. These orders are public, which means anyone can see them. Get your images copyrighted. This can make it easier for you to get control over them and take them down from the internet. You can also hire private companies (takedown services) to remove your images for a fee.” -https://www.webmd.com/sex-relationships/ss/slideshow-sex-drive-changes-age.
"Online piracy is the practice of downloading and distributing copyrighted works digitally without permission, such as music or software.[1][2]" -Wikipedia. --- Support this podcast: https://podcasters.spotify.com/pod/show/antonio-myers4/support

The Free Thought Project Podcast
Guests: Ryan Hartwig & Jason Fyk - The Social Media Gulag & The Free Speech Implications of Porn

The Free Thought Project Podcast

Play Episode Listen Later Jan 23, 2024 62:00


On this episode of the Free Thought Project podcast, Matt and Don are joined by Ryan Hartwig and Jason Fyk for a deep dive into the complex web of free speech, government policies, and big tech. Ryan Hartwig, a former content moderator for Facebook, blew the whistle on internal practices, revealing biases and censorship. His insights into the workings of big tech are crucial in understanding the current digital landscape. Jason Fyk, the founder of The Social Media Freedom Foundation, fights the legal battles against the intertwining of government and tech giants, particularly focusing on the misuse of Section 230. In this riveting discussion, our guests unravel how the U.S. government and big tech companies have co-opted Section 230 of the Communications Decency Act, turning it into a tool to suppress Americans' First Amendment rights. The conversation explores the alarming consequences of such actions on the fundamental freedoms of speech and expression. The dialogue also ventures into the controversial topic of pornography. While acknowledging its negative societal impacts, Ryan and Jason discuss the nuances of addressing this issue without resorting to government overreach or outright bans, underscoring the importance of preserving free speech. The podcast concludes on a hopeful note, with both guests offering their solutions to these pressing issues, reiterating the need for a balanced approach that protects individual freedoms while addressing the challenges posed by the digital age. (Length: 1:02:00) Ryan on Twitter: https://twitter.com/hartwig_free Ryan's Book: https://www.amazon.com/Behind-Mask-Facebook-Whistleblowers-Censorship-ebook/dp/B08X6V3836 Jason Fyk's twitter: https://twitter.com/JasonFyk Social Media Freedom Foundation: https://socialmediafreedom.org/

So to Speak: The Free Speech Podcast
Ep. 204: “Liar in a Crowded Theater” with Jeff Kosseff

So to Speak: The Free Speech Podcast

Play Episode Listen Later Jan 18, 2024 62:44


Jeff Kosseff is an associate professor of cybersecurity law in the United States Naval Academy's Cyber Science Department. He is the author of four books including his most recent, “Liar in a Crowded Theater: Freedom of Speech in a World of Misinformation.” He has also written books about anonymous speech and Section 230 of the Communications Decency Act.   Timestamps   0:00 Introduction 2:30 Jeff's focus on the First Amendment  4:27 What is Section 230? 9:30 “Liar in a Crowded Theater” 16:27 What does the First Amendment say about lies? 19:35 What speech isn't protected?  21:27 The Eminem case  27:33 The Dominion lawsuit  38:44 “The United States of Anonymous” 46:39 The impact of age verification laws  49:43 “The Twenty-Six Words that Created the Internet” 58:40 What's next for Jeff?  1:01:35 Outro    Show Notes    Brown v. Entertainment Merchants Association (2011) FIRE's guide to Section 230 Nikki Haley on social media anonymity Schenck v. United States (1917) “The Twenty-Six Words That Created the Internet” by Jeff Kosseff NBC News: “Judge allows lawsuit against Snap from relatives of dead children to move forward” “The United States of Anonymous: How the First Amendment Shaped Online Speech” by Jeff Kosseff United States v. Alvarez (2012)

Truth Nation Podcast
Snapchat in the Spotlight: A Landmark Ruling on Drug Sales Liability

Truth Nation Podcast

Play Episode Listen Later Jan 16, 2024 58:56


In October 2022, a group of families who lost children to fentanyl sold over Snapchat filed a lawsuit against the company in Los Angeles County Superior Court. According to the legal filings, several of the plaintiffs had met with Snapchat executives in April of 2021. During that meeting, the plaintiffs allege Snapchat executives told them the company was immunized from civil liability by Section 230 of the Communications Decency Act. On January 2, 2024, in what may turn out to be one of the most significant legal decisions of the decade, Los Angeles Superior Court Judge Lawrence Riff ruled the case would go forward. This is the first time in history a social media company is subject to claims it facilitated illegal and fatal drug sales. Join retired California Highway Patrol Chief Mark Garrett and retired DEA Special Agent in Charge Bill Bodner on Episode 6 of the Truth Nation Podcast as they discuss the ruling, the specific allegations in this case, how drugs are sold using the Snapchat platform, and how social media platforms use an outdated Section 230 as a shield to selectively censor and put profits before safety. If you're the parent of a child under the age of 18, this is a must-listen!

One Clap Speech and Debate Podcast
Rock On! Debate: Public Forum 2024 January Topic Analysis with Matt Liu

One Clap Speech and Debate Podcast

Play Episode Listen Later Dec 31, 2023 31:18


5:38 - Hey, Public Forum Debaters! University of Wyoming Director of Debate Matt Liu is here to provide an overview analysis to help you absolutely obliterate the January 2024 Public Forum topic: The United States federal government should repeal Section 230 of the Communications Decency Act. You should check out the Wyoming Debate Roundup write-up of Matt's analysis here: https://wyodebateroundup.weebly.com/blog/pf-janfeb-2024-topic-analysis-section-230 For more notes and details about the episode, check out the One Clap website post here: https://www.oneclapspeechanddebate.com/post/rock-on-debate-public-forum-topic-analysis-2024-january-section-230 If you have any ideas or requests for topics to explore on the One Clap Podcast, shoot Lyle an email at lylewiley@gmail.com or check out our blog and social media here: One Clap Website: www.oneclapspeechanddebate.com YouTube: www.youtube.com/channel/UCyvpV56859lLA-X-EvHVYUg Facebook: @oneclappodcast Instagram: @one_clap_podcast Twitter: @OneClapPodcast TikTok: @oneclapspeechanddebate Get your cool One Clap Speech and Debate merchandise here: https://www.bonfire.com/store/one-clap-speech-and-debate/ Support the show

So to Speak: The Free Speech Podcast
Ep. 202: The backpage.com saga

So to Speak: The Free Speech Podcast

Play Episode Listen Later Dec 21, 2023 62:56


We're joined today by Elizabeth Nolan Brown, Robert Corn-Revere, and Ronnie London to discuss the history and verdict of the Backpage trial.  Backpage.com was an online classified advertising service founded in 2004. As a chief competitor to Craigslist, Backpage allowed users to post ads to categories such as personals, automotive, rentals, jobs and — most notably — adult services. In 2018, the website domain was seized by the FBI and its executives were prosecuted under federal prostitution and money laundering statutes. The trial concluded this year, resulting in the acquittal and convictions of several key executives.  Some First Amendment advocates are concerned that the Backpage case represents a “slippery slope” for the prosecution of protected speech and the rights of websites that host user-generated content. Elizabeth Nolan Brown is a senior editor at Reason Magazine, where she has written about the Backpage case in detail.  Robert Corn-Revere is FIRE's chief counsel and a frequent guest of the show. Prior to joining FIRE, he represented Backpage in private practice. Ronnie London is FIRE's general counsel and another frequent guest of the show. He also represented Backpage when he was in private practice prior to joining FIRE. Timestamps 00:00 Introduction 06:55 The origins of Backpage 10:40 The significance of classified ads 14:52 Are escort ads protected? 19:07 Federal memos indicating Backpage fought child sex trafficking 23:19 Backpage content moderation 34:44 Section 230 of the Communications Decency Act 42:59 “De-banking” and NRA v. Vullo 52:24 The verdict 1:00:34 Could these convictions be overturned? 1:02:49 Outro Show notes  Backpage.com url 2018 Backpage indictment Elizabeth Nolan Brown's 2018 Backpage profile Section 230 of the Communications Decency Act NRA v. Vullo The Travel Act

Marketplace Tech
Dating apps fail to protect some users from predators, Mother Jones finds

Marketplace Tech

Play Episode Listen Later Aug 16, 2023 13:10


Warning: This episode contains references to sexual abuse and violence. Whether for a hookup or to find true love, 3 out of 10 American adults say they have used a dating app, according to the Pew Research Center. But an investigation out Wednesday from Mother Jones looks into how these apps can also incubate abuse, finding that companies like Grindr and Match Group have failed to protect some of their users from predators. At the heart of this story is this question: Is that the companies’ responsibility? The tech industry has long argued the answer is no, thanks to Section 230 of the Communications Decency Act, which protects internet companies from liability for content posted on their sites. Abby Vesoulis is the author of the Mother Jones investigation. Her story begins with Matthew Herrick, whose ex-boyfriend created fake profiles of him on Grindr.

Marketplace All-in-One
Dating apps fail to protect some users from predators, Mother Jones finds

Marketplace All-in-One

Play Episode Listen Later Aug 16, 2023 13:10


Warning: This episode contains references to sexual abuse and violence. Whether for a hookup or to find true love, 3 out of 10 American adults say they have used a dating app, according to the Pew Research Center. But an investigation out Wednesday from Mother Jones looks into how these apps can also incubate abuse, finding that companies like Grindr and Match Group have failed to protect some of their users from predators. At the heart of this story is this question: Is that the companies’ responsibility? The tech industry has long argued the answer is no, thanks to Section 230 of the Communications Decency Act, which protects internet companies from liability for content posted on their sites. Abby Vesoulis is the author of the Mother Jones investigation. Her story begins with Matthew Herrick, whose ex-boyfriend created fake profiles of him on Grindr.

The Lawfare Podcast
Rational Security: The “Not, Like, the Three Greatest Experts at Podcasting” Edition

The Lawfare Podcast

Play Episode Listen Later Feb 26, 2023 72:30


This week on Rational Security, Alan, Quinta, and Scott sat through literally hours of oral arguments to prepare to discuss all the national security developments in the news, including:“The HIMAR Anniversary.” The war in Ukraine is one year old this week. The Biden administration marked the occasion with a presidential visit to Kyiv and a finding of crimes against humanity, while Vladimir Putin celebrated by moving the Doomsday Clock a bit closer to midnight. What should we make of where the war stands one year in?“We're Living in a Post-Algorithm World, and I'm a Post-Algorithm Girl.” So said Justice Elena Kagan (more or less), as she and the other members of the Supreme Court heard arguments in Gonzalez v. Google and Twitter v. Taamneh on terrorism liability and the scope of protections under Section 230 of the Communications Decency Act—a case that some argue could break the internet. What did we learn from oral arguments? And what might the ramifications be?“Bold Dominion.” Dominion Voting Systems filed a stunning brief in its defamation lawsuit against Fox News earlier this week, which lays out in 200 detailed pages the extent to which Fox's executives and on-air personalities knowingly amplified lies about the company's conduct around the 2020 election. What did we learn about Fox's culpability? And what would a Dominion win mean moving forward?Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

The Lawfare Podcast
Lawfare Archive: The Good, the Bad and the Ugly of Section 230 Reform

The Lawfare Podcast

Play Episode Listen Later Feb 25, 2023 56:46


From March 18, 2021: On this episode of Arbiters of Truth, the Lawfare Podcast's miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Daphne Keller, the director of the Program on Platform Regulation at Stanford's Cyber Policy Center and an expert on Section 230 of the Communications Decency Act, the statute that shields internet platforms from civil liability for third-party content on their websites. The statute has been criticized by both Democrats and Republicans, and both President Trump and President Biden separately called for its repeal. So what should we expect in terms of potential revision of 230 during the current Congress? What does Daphne think about the various proposals on the table? And how is it that so many proposals to reform 230 would be foiled by that pesky First Amendment?Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

#SistersInLaw
118: Trump Investigations, Legal Lies & The Internet

#SistersInLaw

Play Episode Listen Later Feb 25, 2023 73:53


#SistersInLaw On Tour: Go to politicon.com/tour to get ready for our live tour in May!  We're starting off in  Portland, OR– May 12.   New York City, NY– May 19.  Washington, DC– May 21.   Get your tickets today - politicon.com/tour  #SistersInLaw break down the latest subpoenas and investigation developments involving Trump– including the targeting  of Jared and Ivanka and the grand juror's ill-advised media tour.  Then, they look at the lies of Mark Brnovich relating to important 2020 election information, before discussing Section 230 of the Communications Decency Act and anti-terrorism law. WEBSITE & TRANSCRIPT Email: SISTERSINLAW@POLITICON.COM or tweet using #SistersInLaw From #SistersInLaw From Barb On grand juror Emily Kors going on a media tour Please Support This Week's Sponsors Reel Paper:  Get 30% off your first order and free shipping on bamboo based environmentally friendly paper products by going to reelpaper.com/sisters and signing up for a subscription using promo code: SISTERS. Calm: Perfect your meditation practice and get better sleep with a 40% off a premium subscription when you go to calm.com/sisters  Noom:  Sign up for a trial of effective weight loss solutions with Noom and check out their groundbreaking book on health when you go to noom.com/sistersinlaw HelloFresh:  Enjoy 65% off plus free shipping on delicious HelloFresh meals delivered right to your door when you go to hellofresh.com/sisters65 and use promo code: SISTERS65 Get More From #Sisters In Law Joyce Vance: Twitter | University of Alabama Law | MSNBC | Civil Discourse Substack Jill Wine-Banks: Twitter | Facebook | Website | Author of The Watergate Girl: My Fight For Truth & Justice Against A Criminal President Kimberly Atkins Stohr: Twitter | Boston Globe | WBUR | Unbound Newsletter Barb McQuade: Twitter | University of Michigan Law | Just Security | MSNBC

The Indicator from Planet Money
The 26 Words That Made The Internet What It Is (Encore)

The Indicator from Planet Money

Play Episode Listen Later Feb 22, 2023 9:34


How one man's legal fight turned 26 ambiguous words from a 1996 law into the shield big tech companies use today. This key part of Section 230 of the Communications Decency Act is at the heart of two cases being argued this week before the Supreme Court.This episode originally came out in April 2021.

The Lawfare Podcast
Gonzalez v. Google and the Fate of Section 230

The Lawfare Podcast

Play Episode Listen Later Feb 17, 2023 70:17


On February 14, the Brookings Institution hosted an event on the upcoming Supreme Court oral arguments in Gonzalez v. Google and Twitter v. Taamneh—two cases that could potentially reshape the internet. The Court is set to hear arguments in both cases next week, on February 21 and 22. Depending on how the justices rule, Gonzalez could result in substantial changes to Section 230 of the Communications Decency Act, the bedrock legal protection on which the internet is built. For today's podcast, we're bringing you audio of that discussion. Lawfare senior editor Quinta Jurecic moderated a panel that included Hany Farid, a professor at the University of California, Berkeley, with a joint appointment in electrical engineering & computer sciences and the School of Information; Daphne Keller, the director of the Program on Platform Regulation at Stanford University's Cyber Policy Center; Lawfare senior editor Alan Rozenshtein; and Lawfare editor-in-chief Benjamin Wittes.Support this show http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.

Bill O’Reilly’s No Spin News and Analysis
State of the Union Fact-Check, Professor Steve Schier Reacts to Biden's Claims, Sarah Huckabee Sanders' Rebuttal, & More

Bill O’Reilly’s No Spin News and Analysis

Play Episode Listen Later Feb 9, 2023 48:07


Tonight's rundown:

  • Talking Points Memo: Bill breaks down President Biden's State of the Union, fact-checking the President's many mistruths.
  • Political Science Professor Steve Schier joins the No Spin News.
  • Arkansas Gov. Sarah Huckabee Sanders' strong Republican rebuttal to Joe Biden.
  • Eleven Democrat-run cities are on the list of the world's most dangerous cities.
  • This Day in History: President Clinton signs the Communications Decency Act into law.
  • Final Thought: Join Team Normal.

Learn more about your ad choices. Visit megaphone.fm/adchoices