POPULARITY
All Minnesota GOP lawmakers in Congress joined their fellow Republicans in voting for a budget framework that includes $2 trillion in spending cuts. While it doesn't specify the programs, Republicans have targeted Medicaid and food aid programs. Tribal leaders are asking members of Congress to address funding concerns and uphold the federal government's treaty obligations to tribes. Leech Lake Band of Ojibwe Secretary-Treasurer Leonard Fineday testified to a House Appropriations subcommittee Tuesday afternoon. A new report released Wednesday by the Minnesota Chamber Foundation found that nearly 60 percent of the state's total labor force and employment growth came from foreign-born workers from 2019 to 2023. Those stories and more in today's morning update. Hosted by Gracie Stockton.
Organisations fighting against the abuse of children are once again calling for the term "child pornography" to be done away with. The organisations believe the right term, one that conveys the gravity of the crime, is "child sexual abuse material". They say the term "child pornography" implies that a child consented to be part of the act and minimises the fact that children are victims of violence and abuse. In its 2021 report, the South African Law Commission also recommended amendments to the Sexual Offences Act and the Child Justice Act in favour of the term "child sexual abuse material". Neria Hlakotsa filed this report...
Today is Wednesday, Dec. 11. Here are some of the latest headlines from the Fargo, North Dakota area. InForum Minute is produced by Forum Communications and brought to you by reporters from The Forum of Fargo-Moorhead and WDAY TV. For more news from throughout the day, visit InForum.com.
A 50-year-old Penn Valley man was arrested after a search of his home turned up multiple forms of child sexual abuse material. The Nevada County Sheriff's Office Major Crimes Unit is urging anyone who suspects that their child may have been a victim to contact it at (530) 265-1471.
Today is Thursday, Nov. 21, 2024. The Brainerd Dispatch Minute is a product of Forum Communications Co. and is brought to you by reporters at the Brainerd Dispatch. Find more news throughout the day at BrainerdDispatch.com. The Brainerd Dispatch is proud to be a part of the Trust Project. Learn more at thetrustproject.org.
We talk about a lot of sensitive topics as school counselors, and today is no exception. We're taking a deep dive into child sex trafficking. Heavy, I know. But with sex trafficking happening all around us, this just might be the conversation you never knew you needed to hear.

I'm joined by Krystal Nierman of Switch, an organization fighting to end sex trafficking and sexual exploitation. Krystal shares what sex trafficking really looks like… and there's a good chance it's not what you think! Plus, you'll hear how to spot the red flags among students, what to do if you suspect it's happening, and actions we can take (and teach to others) to help prevent sex trafficking.

Be sure to check the links below for more valuable resources on this topic. This is something that could be affecting students at your own school in various ways, so keep asking questions and follow your gut if you feel like something is off. Share this episode with fellow counselors to help spread awareness around child sex trafficking!

Resources Mentioned:
Free Resource: Editable Scholarship Spreadsheet - https://counselorclique.com/scholarship
Switch Website: https://switchsc.org
National Human Trafficking Hotline: 1-888-373-7888
National Center for Missing and Exploited Children (NCMEC): https://www.missingkids.org/home
Child Sex Trafficking: https://www.missingkids.org/theissues/trafficking
Child Sexual Abuse Material: https://www.missingkids.org/theissues/csam
National Center on Sexual Exploitation (NCOSE): https://endsexualexploitation.org

Connect with Lauren:
Sign up for the free, 3-day prep for High School Counseling Job Interviews: https://counselorclique.com/interviews
Visit my TpT store: https://counselorclique.com/shop
Send me a DM on Instagram @counselorclique: https://instagram.com/counselorclique
Follow me on Facebook: https://facebook.com/counselorclique
Send me an email: lauren@counselorclique.com
Join the Clique Collaborative: http://cliquecollab.com
Full show notes on website: https://counselorclique.com/episode150
Child sexual abuse material detected in the Netherlands has almost doubled, from 47% of the total detected globally in 2018 to 71% in 2019. This is due to a pervasive business model of "bulletproof hosting", which takes advantage of the more permissive legal system and excellent technical infrastructure that the Netherlands provides.
Monday, August 12th, 2024

Today, the nation's oldest and largest Latino civil rights organization endorses Kamala Harris for president; a former North Dakota state senator pleads guilty to traveling to Prague to have commercial sex with children; a Capitol riot defendant has been jailed over alleged threats against a Supreme Court justice and other officials; Trump is losing support from ultra right wing influencers; the US has announced another $125M in support of Ukraine; plus Allison and Dana deliver your Good News.

Promo Code:
Helix is offering up to 20% off all mattress orders AND two free pillows for our listeners! Go to https://www.helixsleep.com/dailybeans.

Stories:
Trump Has Started to Piss Off White Supremacists (The New Republic)
LULAC, nation's oldest and largest Latino civil rights organization, endorses Kamala Harris for president (CBS News)
Powerful former North Dakota lawmaker pleads guilty to traveling to Europe to pay for sex with minor (AP News)
Capitol riot defendant jailed over alleged threats against Supreme Court justice and other officials (AP News)

Give to the Kamala Harris Presidential Campaign:
Kamala Harris (MSW Media Donation Link) — Donate via ActBlue

Check out other MSW Media podcasts: https://mswmedia.com/shows/

Subscribe to Lawyers, Guns, And Money
Ad-free premium feed: https://lawyersgunsandmoney.supercast.com
Subscribe for free everywhere else: https://lawyersgunsandmoney.simplecast.com/episodes/1-miami-1985

Subscribe for free to MuellerSheWrote on Substack: https://muellershewrote.substack.com

Follow AG and Dana on Social Media:
Dr. Allison Gill
Follow Mueller, She Wrote on Post: https://post.news/@/MuellerSheWrote?utm_source=TwitterAG&utm_medium=creator_organic&utm_campaign=muellershewrote&utm_content=FollowMe
https://muellershewrote.substack.com
https://twitter.com/MuellerSheWrote
https://www.threads.net/@muellershewrote
https://www.tiktok.com/@muellershewrote
https://instagram.com/muellershewrote

Dana Goldberg
https://twitter.com/DGComedy
https://www.instagram.com/dgcomedy
https://www.facebook.com/dgcomedy
https://danagoldberg.com

Have some good news; a confession; or a correction to share?
Good News & Confessions - The Daily Beans: https://www.dailybeanspod.com/confessional/

From The Good News:
Kamala Harris (MSW Media Donation Link) — Donate via ActBlue
Tara Davis-Woodhall (IG Profile)
Public Service Loan Forgiveness (PSLF)

Live Show Ticket Links:
https://allisongill.com (for all tickets and show dates)
Friday, August 16th, Washington, DC - with Andy McCabe, Pete Strzok, Glenn Kirschner: https://tinyurl.com/Beans-in-DC
Saturday, August 24, San Francisco, CA: https://tinyurl.com/Beans-SF

Listener Survey: http://survey.podtrac.com/start-survey.aspx?pubid=BffJOlI7qQcF&ver=short

Follow the Podcast on Apple: The Daily Beans on Apple Podcasts

Want to support the show and get it ad-free and early?
Supercast: https://dailybeans.supercast.com/
Or Patreon: https://patreon.com/thedailybeans
Or subscribe on Apple Podcasts with our affiliate link: The Daily Beans on Apple Podcasts
In today's episode, we have an incredibly important and timely discussion lined up for you. Joining us is Catherine Knibbs, a renowned cyber specialist, who will help us navigate the complex and often troubling landscape of cyber trauma. Together, we'll delve into what's really happening to our children online and the profound effects of exposure to disturbing content, including pornography. We'll also address the alarming rise of child grooming and paedophilia, exploring how these threats are evolving in our digital age. This episode is a must-listen for parents, educators, and anyone concerned about the safety and well-being of young people in today's connected world.

Timestamps:
00:41 - Brogan's Personal Connection to Catherine's Work
01:16 - Catherine's Background and Military Experience
03:24 - Life in the Military and Gender Dynamics
05:26 - Transition from Military to IT and Cybersecurity
06:27 - Shift to Child Psychotherapy and Personal Healing
08:58 - Intersection of Cyber and Trauma Fields
09:51 - Impact of Internet and Gaming on Children
10:51 - Origin of Cyber Trauma Concept
11:00 - Evolution of Internet and Pornography
13:36 - Children and Exposure to Disturbing Content
15:16 - Desensitization in Military and Society
17:02 - Online Safety and Generational Impact
18:03 - Child Sexual Abuse Material and Grooming
19:00 - Methods of Online Grooming
20:52 - Platforms Used for Grooming and Sextortion
23:09 - Shame and Blackmail in Sextortion
25:03 - Connection and Disconnection in Grooming
26:21 - Global Perpetrators and Grooming Manuals
27:03 - Research on Perpetrators and Their Backgrounds
29:30 - Challenges in Law Enforcement and Vigilante Groups
31:59 - Understanding the Darknet
36:05 - Vulnerabilities Leading to Online Grooming
37:53 - Importance of Media Literacy
39:07 - Tech Industry's Role and Parental Guidance
41:04 - Personal Experience with Online Scams
42:14 - Generational Vulnerability to Online Scams
43:19 - Appropriate Age for Social Media Access
45:02 - Personal Experience with Social Media and School Trauma
What is virtual child sexual abuse material (previously known as child pornography)? Is sexual content in Japanese manga and anime legal? What is the impact of artificial intelligence (AI) on child sexual exploitation? Join the ZOE Japan team as we discuss old and new trends related to virtual CSAM. INHOPE - INHOPE has more than 50 member hotlines around the world that receive, analyze and issue take-down notices for child sexual abuse material reported by the general public. If you see any potential child sexual abuse material on the internet or social media, please use this link to find a hotline in your country and submit the URL for further analysis. Internet Hotline Center - The Internet Hotline Center (IHC) receives reports of potential child sexual abuse material from the general public, issues take-down notices to platforms where the content is hosted, and transfers leads to the National Police Agency for further investigation. Their website is available in English and Japanese.
As end-to-end encryption becomes ever more widespread in popular messaging services, discussions have emerged about enabling the scanning of messages and files directly on end-user devices. Carolyn Guthoff and Divyanshu Bhardwaj, usable-security researchers at CISPA, have looked into these systems in their latest study on client-side scanning. They discuss the potential implications such a system would bring and how client-side scanning is perceived by experts, from cybersecurity researchers to law enforcement agencies. Content warning: This episode contains mention of how client-side scanning could help combat child sexual abuse material.
A new report from Stanford University's Internet Observatory reveals that the National Center for Missing and Exploited Children is ill-prepared to combat child sexual abuse material (CSAM) generated by artificial intelligence (AI). Criminals are using AI technology to create explicit images, making it difficult for authorities to identify and rescue real victims. The CyberTipline, which collects reports on CSAM, is overwhelmed by incomplete and inaccurate tips and the sheer volume of reports. The report calls for updated technology and laws to address this crime, and lawmakers are already working to criminalize the use of AI-generated explicit content. The report emphasizes the urgent need for increased funding and improved access to technology for the National Center for Missing and Exploited Children. --- Send in a voice message: https://podcasters.spotify.com/pod/show/tonyphoang/message
In this episode of Cause & Purpose, we sit down with Sarah Gardner, a trailblazer in the fight against child sexual abuse material (CSAM) through her work with the Heat Initiative. Sarah's commitment to creating a safer internet landscape for children is not just a mission — it's a calling. Under Sarah's guidance, the Heat Initiative aims to elevate internet safety standards, advocating for robust protections that shield children from the dangers lurking within the digital realm. Her current campaign is a bold move to hold industry giants like Apple accountable, urging them to fulfill their promises to identify and flag content that exploits children. This initiative sets a precedent, challenging tech companies to prioritize child safety and establish rigorous online standards. Join us as we learn a bit about Sarah's background, the significant changes she's championing, and how each of us can play a role in safeguarding the innocence of children online.
Stanford's Evelyn Douek and Alex Stamos talk to Riana Pfefferkorn and David Thiel of the Stanford Internet Observatory about the technical and legal challenges of addressing computer-generated child sexual abuse material. They mention:

Riana's new paper on the topic, "Addressing Computer-Generated Child Sex Abuse Imagery: Legal Framework and Policy Implications" - Riana Pfefferkorn / Lawfare
David's report documenting child sexual abuse material in a major dataset used to train AI models - David Thiel / SIO; Samantha Cole / 404 Media

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance. Like what you heard? Don't forget to subscribe and share the podcast with friends!
Listen in and hear how artificial intelligence (A.I.) is being abused to sexually exploit children. From copying the pictures of children you post on social media to generating fake images of graphic scenes of abuse, A.I. is creating massive amounts of real and edited sexual abuse material to feed the appetites of pedophiles. Hear how ChatGPT, all social media platforms, "undressing" apps, and other forms of A.I. are using algorithms to create customized child sexual abuse material – and doing nothing to protect children from these online and digital exploitations.

In Science & Love,
~ Dr. Renee' Carr

#AIpedophile #CSAM #ArtificialIntelligence #ChildSafety #antitrafficking #ChildExploitation #EARNITact #EARNit #aiCSAM
One of the dark sides of the rapid development of artificial intelligence and machine learning is the increase in computer-generated child pornography and other child sexual abuse material, or CG-CSAM for short. This material threatens to overwhelm the attempts of online platforms to filter for harmful content—and of prosecutors to bring those who create and disseminate CG-CSAM to justice. But it also raises complex statutory and constitutional legal issues as to what types of CG-CSAM are, and are not, legal. To explore these issues, Associate Professor of Law at the University of Minnesota and Lawfare Senior Editor Alan Rozenshtein spoke with Riana Pfefferkorn, a Research Scholar at the Stanford Internet Observatory, who has just published a new white paper in Lawfare's ongoing Digital Social Contract paper series exploring the legal and policy implications of CG-CSAM. Joining in the discussion was her colleague David Thiel, Stanford Internet Observatory's Chief Technologist and a co-author of an important technical analysis of the recent increase in CG-CSAM. Support this show: http://supporter.acast.com/lawfare. Hosted on Acast. See acast.com/privacy for more information.
The increase in self-generated child sexual abuse content is alarming. In 2022, more than three quarters (78%) of the webpages the IWF identified as containing child sexual abuse material were tech-enabled, i.e. created via smartphones or webcams without the offender being physically present in the room with the child. As we release the Talk Trust Empower report, this episode delves into how children – many of them of primary school age – are groomed and extorted into producing self-generated imagery, how the IWF is working to raise awareness of the phenomenon, and what parents and carers can do to help children navigate dangers online. Support the show
The National Human Rights Commission (NHRC) released a set of guidelines last month for the Central and State governments to tackle the problem of child sexual abuse material, or CSAM, on the internet. Over the past few years, there has been a colossal increase in the availability of CSAM online. In its 2023 report, WeProtect Global Alliance, which consists of governments, companies and charities working together for digital safety, said there had been an 87% increase in such cases since 2019. What is the situation like in India? The NHRC says that, according to the CyberTipline 2022 statistics, of the 32 million reports received about child sexual abuse material, 5.6 million were uploaded by perpetrators based in India. Recently, the United Kingdom passed a stringent online safety bill that introduces a number of obligations on how large tech firms must design, operate and moderate their platforms. Other countries too are contemplating or have already put in place such measures. What are the vulnerabilities children in our country face when they go online with their devices? How does child sexual abuse material online lead to offline consequences? Where does India stand when it comes to regulating and making the digital space safe for children?
While AI image generators can be fun, there have been rising concerns that some AI models have been abused to create child sexual abuse images. The Internet Watch Foundation, a UK-based organisation which works to identify and remove online child sexual abuse images, has been warning that AI-generated child sexual abuse imagery must be stopped before it overwhelms the internet. Here to help us understand the concerns behind this and how our laws can also protect children in Malaysia is Ajeet Kaur, lawyer & co-chairperson, CRIB Foundation.

Image Credit: 123RF
Child sexual abuse material, or CSAM, is spreading like a deadly virus on the internet. While there has been growing awareness of the problem and a concerted effort by law enforcement and technology companies to combat CSAM, a significant increase in the production, distribution, and consumption of CSAM has been seen in recent years. A report also found that the average age of victims depicted in CSAM is decreasing, with nearly half of all victims now under the age of 10. What exactly is this content? How easily accessible is it really? What do these platforms have to say? What is the government doing? And most importantly, what can a layman do to safeguard children? Dia Rekhi put ET's Aashish Aryan and Wranga's Ashish Jaiman in the hot seat and got them to explain why CSAM content on these platforms is NOT child's play. Listen to the latest episode of The Morning Brief podcast!

Disclaimer: This episode has references to and descriptions of partial nudity, pornography and sexual exploitation. Listener discretion is advised.

If you want to know more about Aashish Aryan's ET article, check it out here. This is the article we discussed in the episode.

Highlights:
05:25 till 25:36 - What is CSAM, and how have social media companies warned of strict action regarding child abuse content? By ET's Aashish Aryan.
26:27 till 35:02 - How is Gen AI impacting the dissemination of CSAM content on various platforms? By Wranga's Ashish Jaiman.

If you like this episode, check out other similar episodes on Business of Sleaze: The Dark Side of Livestreaming, Fatigued & Flying: Why tired pilots are a wake-up call, Cheeni Kum: The Bitter Truth of Aspartame and more! You can follow our host Dia Rekhi on her social media: Twitter & Linkedin. Catch the latest episode of 'The Morning Brief' on ET Play, The Economic Times Online, Spotify, Apple Podcasts, JioSaavn, Amazon Music and Google Podcasts. See omnystudio.com/listener for privacy information.
Today we talk about an executive from Twitter / X who publicly defends the choice to restore an account that shared explicit child sexual abuse material. It's an absolutely shocking and horrendous move to make, and -- in our opinion -- indefensible. But they try to defend their actions anyway. Read more on @Forbes here: https://www.forbes.com/sites/mattnovak/2023/08/09/twitter-exec-defends-restoring-account-that-shared-child-sex-abuse-material/

Subscribe for daily episodes. Join Discord to chat! https://discord.gg/7QsrTbKchc

SOCIAL:
• Peter: https://twitter.com/pgl
• Jon: https://twitter.com/jonnisec
• Mike: he's just unsociable
• Curated privacy and security news feed: https://twitter.com/privsecnews
This episode may be difficult for some people to hear. Please make sure you can listen somewhere out of public spaces, so you can take a break if you need one. The content today is raw, potent and speaks to what this crime is WITHOUT DETAILS; even so, it may be difficult to listen to. We must start having these conversations outside of law enforcement, without blame, shame and too much detail, and we need to educate with compassion and not anger. Sergio explains why he is committed to working in this field, currently without wages, because this is a profession laden with necessity and drive. What we discuss today is the reality of many years of work in this field for both of us and a topic I want to speak to more loudly than ever before. Sergio is a fantastic guest today and discusses what we really need to do going forward. Big tech and society, together:
In this episode of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal take you on a fast-paced global tour of privacy developments, including Oregon sending a privacy bill to the governor, Texas passing a privacy law, Connecticut enhancing its privacy law, Apple's announcement on child sexual abuse material, the parliamentary monsoon session in India, and two items out of the Swedish DPA - a fine on Spotify and one on Bonnier News. Also, we briefly touch on the 10th anniversary of the Snowden leaks. As always, if you have comments or questions, find us on LinkedIn, Twitter @podcastprivacy @euroPaulB @heartofprivacy and email podcast@seriousprivacy.eu. Please do like and write comments on your favorite podcast app so other professionals can find us easier. Rate and Review us! #heartofprivacy #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO
This week, two Lantern Rescue operators share about the recent conference in Africa. Learn what we are up against in Africa and how eight countries and multiple partners are coming together to combat human trafficking, child sacrifice, and child sexual abuse material. A warning: this program contains sensitive content. Listener discretion is advised. Call the National Human Trafficking Hotline (NHTH) at 1-888-373-7888. Learn more at https://lanternrescue.org
Stanford's Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory (SIO) Research Manager Renée DiResta and Chief Technologist David Thiel to discuss a new report on a months-long investigation into the distribution of illicit sexual content by minors online.

Large Networks of Minors Appear to be Selling Illicit Sexual Content Online
The Stanford Internet Observatory (SIO) published a report last week with findings from a months-long investigation into the distribution of illicit sexual content by minors online. The SIO research team identified a large network of accounts claiming to be minors, likely teenagers, who are producing, marketing and selling their own explicit content on social media. A tip from The Wall Street Journal informed the investigation with a list of common terms and hashtags indicating the sale of "self-generated child sexual abuse material" (SG-CSAM). SIO identified a network of more than 500 accounts advertising SG-CSAM, with tens of thousands of likely buyers. With only public data, this research uncovered and helped resolve basic safety failings with Instagram's reporting system for accounts with suspected child exploitation, and with Twitter's system for automatically detecting and removing known CSAM. Most of the work to address CSAM has focused on adult offenders, who create the majority of content. These findings highlight the need for new countermeasures developed by industry, law enforcement and policymakers to address sextortion and the sale of illicit content that minors create themselves.

Front-Page Wall Street Journal Coverage
A Wall Street Journal article first covered Twitter's lapse in safety measures to prevent known CSAM from appearing on the site and the importance of researcher access to public social media data to identify and help address such issues. - Alexa Corse / The Wall Street Journal
Instagram was the focus of a larger Wall Street Journal investigation, based in part on SIO's research findings. The app is currently the most significant platform for these CSAM networks, connecting young sellers with buyers via recommendation features, hashtag search, and direct messaging. - Jeff Horwitz, Katherine Blunt / The Wall Street Journal

Bipartisan Concern and Calls for Social Media Regulation
The investigation sparked outrage across the aisle in the U.S. and grabbed the attention of the European Commission as the European Union prepares to enforce the Digital Services Act for the largest online platforms later this summer. Thierry Breton, the top EU official for trade and industry regulation, announced that he will meet with Meta CEO Mark Zuckerberg later this month at the company's Menlo Park headquarters to discuss the report and demand the company take action. In Congress, House Energy and Commerce Democrats and GOP senators were most outspoken about taking action to address the concerning findings.
Senate Judiciary Ranking Member Lindsey Graham (R-SC) suggested a hearing on the findings during a Senate Judiciary markup session.
Sen. Tom Cotton (R-AR) @SenTomCotton: "Social media isn't safe for kids. At a minimum, we should require age verification and parental consent."
Sen. Rick Scott (@SenRickScott): "Every parent should read this story. Social media is NOT SAFE for our kids. What is described here is disgusting and needs to be shut down now!"
House Energy and Commerce Committee Democrats released statements that they were "appalled" and "disgusted" by the role Instagram plays in connecting minors with buyers for abuse content. - Office of Congressman Frank Pallone, Office of Congresswoman Jan Schakowsky
Rep. Ken Buck (@RepKenBuck): "How do we expect Big Tech companies like @Meta to regulate themselves when they allow vast networks of pedophiles to operate freely? #pedogram"
Rep. Anna Paulina Luna (@RepLuna): "Instead of meddling in elections, it would be cool if Mark Zuckerburg spent a few Zuckerbucks on cleaning up the Pedogram network."

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos. Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance. Like what you heard? Don't forget to subscribe and share the podcast with friends!
Evelyn and Alex talk to David Thiel and Renée DiResta, Alex's two co-authors on a report released by the Stanford Internet Observatory last week with findings from an investigation into the distribution of illicit sexual content by minors online. They talk about the findings, how social media companies should be and have been responding, and the public and political response to the report.
This podcast is a commentary and does not contain any copyrighted material of the reference source. We strongly recommend accessing/buying the reference source at the same time. ■Reference Source https://www.ted.com/talks/julie_cordua_how_we_can_eliminate_child_sexual_abuse_material_from_the_internet ■Post on this topic (You can get FREE learning materials!) https://englist.me/95-academic-words-reference-from-julie-cordua-how-we-can-eliminate-child-sexual-abuse-material-from-the-internet-ted-talk/ ■Youtube Video https://youtu.be/M0siBlOVkgk (All Words) https://youtu.be/sIGn3YVLP6M (Advanced Words) https://youtu.be/397DuypeJ2o (Quick Look) ■Top Page for Further Materials https://englist.me/ ■SNS (Please follow!)
The European Union's Internal Market commissioner Thierry Breton has called on Meta CEO Mark Zuckerberg to take "immediate action" or face heavy fines after a report this week revealed Instagram to be the worst social network for child sexual abuse material.
P.M. Edition for June 5. Researchers at Stanford found that Twitter failed to prevent known images of child sexual abuse from being posted on its platform. Twitter told researchers it has since improved its detection system. Twitter reporter Alexa Corse talks about the challenges of eliminating such content from social media. Plus, American Airlines wants to reinvent business travel. Airlines and airline travel reporter Alison Sider explains what it will mean for airfare. Annmarie Fertoli hosts. Learn more about your ad choices. Visit megaphone.fm/adchoices
The Internet Watch Foundation (IWF) is an organization based in the United Kingdom that works to combat the distribution of online child sexual abuse material (CSAM) and other illegal content. Established in 1996, the IWF operates as an independent, self-regulatory body supported by the internet industry, law enforcement agencies, and the government.

The main objective of the IWF is to minimize the availability of CSAM on the internet and protect children from online exploitation. They achieve this by actively searching for and removing illegal content hosted on websites worldwide. When the IWF identifies explicit or abusive material, they work closely with internet service providers (ISPs) to have the content blocked or taken down.

The IWF also operates a hotline that allows members of the public to report suspected instances of CSAM found online. These reports are assessed by the IWF's analysts, who then take appropriate action to have the illegal content removed and ensure that the relevant authorities are notified.

Additionally, the IWF plays a vital role in providing support and guidance to internet industry partners, helping them establish policies and practices to prevent the distribution of illegal content on their platforms. They collaborate with law enforcement agencies both nationally and internationally, sharing information and intelligence to aid in investigations and prosecution of offenders.

Learn more about the foundation here. Read the blog post I review in the episode here. This episode was brought to you by CONSENTparenting. Learn more about CONSENTparenting here.

TIME STAMPS:
Introduction of the episode. (0:00)
The Internet Watch Foundation's report. (2:02)
What are the categories of indecent images? (3:44)
Explaining the definition of child sexual abuse. (6:08)
The results of the investigation. (10:44)
What objects were being used for penetration? (12:03)
A global crisis of our era. (14:06)
Is child sexual abuse on the rise? (18:02)
The rise of online abuse. (19:57)
What is online safety? (22:15)
Today's podcast episode is a conversation surrounding the sensitive and important topic of childhood sexual exploitation. Please practise listener discretion. This conversation is not suitable for all, and themes of exploitation, abuse and domestic violence are mentioned. Conrad is currently Principal Advisor on Child Sexual Exploitation with IFYS and oversees Project Paradigm, a national programme aimed at addressing child sexual exploitation in Australia. Alongside this, he supports professional practice through external professional supervision, consultancy and training on child protection related issues. He sits on the board of the Australian Community Workers Association, is chair of the National Strategic Partnership on Child Sexual Exploitation, a member of the National Working Group on Responses to Victims of Child Sexual Abuse Material, a member of the National Stakeholder Working Group for the Australian Centre to Counter Child Exploitation, and an occasional peer reviewer for the Journal of Sexual Aggression. Donate to the cause here. For more from the I am. with Kylie Lately podcast - more personal reflections, more conversations with guests, & more juicy self-development inspiration, you can join our members only platform here... https://plus.acast.com/s/the-kylie-camps-podcast. Hosted on Acast. See acast.com/privacy for more information.
Government threats to end-to-end encryption—the technology that secures your messages and shared photos and videos—have been around for decades, but the most recent threats to this technology are unique in how they intersect with a broader, sometimes-global effort to control information on the Internet. Take two efforts in the European Union and the United Kingdom. New proposals there would require companies to scan any content that their users share with one another for Child Sexual Abuse Material, or CSAM. If a company offers end-to-end encryption to its users, effectively locking the company itself out of being able to access the content that its users share, then it's tough luck for those companies. They will still be required to find a way to essentially do the impossible—build a system that keeps everyone else out, while letting themselves and the government in. While these government proposals may sound similar to previous global efforts to weaken end-to-end encryption, like the United States' prolonged attempt to tarnish end-to-end encryption by linking it to terrorist plots, they differ because of how easily they could become tools for censorship. Today, on the Lock and Code podcast with host David Ruiz, we speak with Mallory Knodel, chief technology officer for the Center for Democracy and Technology, about new threats to encryption, the return of old, bad proposals, who encryption benefits (everyone), and how building a tool to detect one legitimate harm could, in turn, create a tool to detect all sorts of legal content that other governments simply do not like. "In many places of the world where there's not such a strong feeling about individual and personal privacy, sometimes that is replaced by an inability to access mainstream media, news, accurate information, and so on, because there's a heavy censorship regime in place," Knodel said. 
"And I think that drawing that line between 'You're going to censor child sexual abuse material, which is illegal and disgusting and we want it to go away,' but it's so very easy to slide that knob over into 'Now you're also gonna block disinformation,' and you might at some point, take it a step further and block other kinds of content, too, and you just continue down that path." Knodel continued: "Then you do have a pretty easy way of mass-censoring certain kinds of content from the Internet that probably shouldn't be censored." Tune in today. You can also find us on Apple Podcasts, Spotify, and Google Podcasts, plus whatever preferred podcast platform you use. Show notes and credits: Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com) Licensed under Creative Commons: By Attribution 4.0 License http://creativecommons.org/licenses/by/4.0/ Outro Music: “Good God” by Wowa (unminus.com)
There is more child sexual abuse content online than ever before. And not just on the dark web, but on platforms like Facebook, Twitter, and Snapchat. What can the major tech companies do to stop it?
Apple decides not to scan your iCloud photos against the Child Sexual Abuse Material (CSAM) database and has other plans to expand its Communication Safety features. The Federal Trade Commission filed an antitrust complaint against Microsoft alleging that its bid to acquire Activision Blizzard violates U.S. law. Meta's content moderation policy for Facebook and Instagram had a double standard for high-profile users compared to regular folks. And the likes of Meta and Microsoft contemplate trying to gain Twitter market share with Twitter-like and WeChat-like super apps, respectively. Link to Show Notes. Hosted on Acast. See acast.com/privacy for more information.
Elon Musk's presence has loomed over Twitter since he announced plans to purchase the platform. And for these few weeks that he's been in charge, many concerns have proven to be justified. Musk laid off 3,700 employees, and then 4,400 contractors. He is firing those who are critical of him. The verification process, perhaps one of Twitter's most trusted features, has been unraveled. He's offered severance to those who don't want to be part of “extremely hardcore” Twitter. Following the results of a Twitter poll, he reinstated the account of Donald Trump, who was suspended from the platform for his role in inciting the January 6th attacks. So, what happens now? What of the many social movements that manifested on Twitter? While some movements and followings may see new manifestations on other platforms, not everything will be completely recreated. For example, as writer Jason Parham explains, “whatever the destination, Black Twitter will be increasingly difficult to recreate.” In this episode of Community Signal, Patrick speaks to three experts: Sarah T. Roberts, associate professor in the Department of Information Studies at UCLA, trust and safety consultant Ralph Spencer, and Omar Wasow, assistant professor in UC Berkeley's Department of Political Science and co-founder of BlackPlanet, about the current state and future of Twitter. They dissect the realities facing the platform today including content moderation, loss of institutional knowledge, and uncertainty about Twitter's infrastructure, but also emphasize the importance of Twitter as a social utility for news and more. This episode also touches on: The reality of moderating a platform like Twitter What platforms actually mean when they say they're for “free speech” How Musk tanked the value of verification on Twitter Big Quotes On the future of content moderation at Twitter (8:28): “There's no way possible with the cuts [Musk has] made that he's going to be able to do any type of content moderation. 
… [He] isn't going to have anybody who remotely begins to know how to do that [legal compliance and related work].” –Ralph Spencer Sarah T. Roberts' moderation challenge for Elon Musk (11:19): “I want Elon Musk to spend one day as a frontline production content moderator, and then get back to this [Community Signal] crew about how that went. Let us know what you saw. Share with us how easy it was to stomach that. Were you able to keep up with the expected pace at Twitter? Could you … make good decisions over 90% of the time, over 1,000, 2,000 times a day? Could you do that all the while seeing animals being harmed, kids being beat on, [and] child sexual exploitation material?” –@ubiquity75 Bumper sticker wisdom doesn't make good policy (15:46): “Everything [Musk has said about free speech] has had the quality of good bumper stickers but is totally divorced from reality, and that doesn't bode well, obviously.” –@owasow The responsibility in leading a social media platform (19:41): “One thing that we are seeing in real-time [at Twitter] is what a danger there is in having one individual – especially a very privileged individual who does not live in the same social milieu as almost anyone else in the world – one very privileged individual's ability to be the arbiter of … these profoundly contested ideological notions of something like free speech which again is continually misapplied in this realm.” –@ubiquity75 Musk's peddling of conspiracy theories (20:29): “[Musk is] running around tweeting that story about Nancy Pelosi's husband, the false article about what happened between him and his attacker. What kind of example is that to set? 
… What it is to me is like this kid who has way too much money, and he found a new toy he wants to play with.” –Ralph Spencer Leading with humility (21:23): “[If you're running a site like Twitter,] you have to have a ‘small d' democratic personality, which is to say you really have to be comfortable with a thousand voices flourishing, a lot of them being critical of you, and that's not something that you take personally.” –@owasow There are always limits on speech (23:50): “When you declare that your product, your site, your platform, your service is a free speech zone, there is always going to be a limit on that speech. … [CSAM] is the most extreme example that we can come up with, but that is content moderation. To remove that material, to disallow it, to enforce the law means that there is a limit on speech, and there ought to be in that case. If there's a limit on speech, it is by definition not a free speech site. Then we have to ask, well, what are the limits, and who do they serve?” –@ubiquity75 “Free speech” platforms are not a thing (25:25): “When I hear people invoke free speech on a for-profit social media site, not only does that not exist today, it never has existed, and it never will exist. Let's deal with what reality is actually giving us and talk about that instead of these fantasies that actually are pretty much not good for anyone involved.” –@ubiquity75 The social weight and trust that verification brought to interactions on Twitter (32:52): “[Twitter] has outsized social impact, whether it's in the political arena, whether it's in social movements, whether it's in celebrity usage, all of these things have been true. In terms of political movements, the good, bad, the ugly. 
We saw an insurrection against the United States launched by the President of the United States on Twitter, so it's not all rosy, but the point is that Twitter had this outsized power and part of that could be attributed … to this verification process that let a lot of high profile folks, prominent individuals, media organizations, other kinds of people in the zeitgeist or in the public eye, engage with a certain sense of security.” –@ubiquity75 How does Twitter sustain its infrastructure amidst the mass layoffs and resignations? (39:18): “We have good reason to fear that [Twitter's] infrastructure is going to get considerably worse over time. [Musk has] fired enough of the people. … In a lot of ways, [Twitter is] like a telephone company. It's got a lot of boring infrastructure that it has to maintain so that it's reliable. [Musk has] taken a bunch of these pillars or blocks in the Jenga stack and knocked them out, and it's a lot more wobbly now.” –@owasow Musk's Twitter user experience is not the common one (48:23): “[Musk is] obsessed with bots and spam, but why is that such a compulsion for him? Well, he has 100-plus million followers, and when he looks at his replies, there's probably a lot of bots and spam there. That's not where I live because I'm a civilian. His perspective is distorted in a way partly by the investment around him but partly also by just being so way out of proportion to almost any other human on Earth.” –@owasow About Our Guests Omar Wasow is an assistant professor in UC Berkeley's Department of Political Science. His research focuses on race, politics, and statistical methods. Previously, Omar co-founded BlackPlanet, an early leading social network, and was a regular technology analyst on radio and television. He received a PhD in African American Studies, an MA in government, and an MA in statistics from Harvard University. 
Ralph Spencer has been working to make online spaces safer for more than 20 years, starting with his time as a club editorial specialist (message board editor) at Prodigy, and then graduating to America Online. During his time at AOL, he was in charge of all issues involving Child Sexual Abuse Material or CSAM. The evidence that Ralph and the team he worked with in AOL's legal department compiled contributed to numerous arrests and convictions of individuals for the possession and distribution of CSAM. He currently works as a freelance trust and safety consultant. Sarah T. Roberts is an associate professor in the Department of Information Studies at UCLA. She holds a PhD from the iSchool at the University of Illinois at Urbana-Champaign. Her book on commercial content moderation, Behind the Screen, was released in 2019 from the Yale University Press. She served as a consultant, too, and is featured in the award-winning documentary The Cleaners. Dr. Roberts sits on the board of the IEEE Annals of the History of Computing, was a 2018 Carnegie Fellow, and a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media. Related Links Elon Musk takes control of Twitter and immediately ousts top executives (via NPR) Omar Wasow's website Omar Wasow on Twitter BlackPlanet.com, founded by Wasow Ralph Spencer on LinkedIn Sarah T. Roberts' website Sarah T. Roberts on Twitter Behind the Screen: Content Moderation in the Shadows of Social Media, by Sarah T. Roberts Note from Patrick: After 5 years, this is Carol's final episode as editorial lead on Community Signal. We'll miss you, Carol! The Twitter Rules Code and Other Laws of Cyberspace by Lawrence Lessig Elon Musk says Twitter will have a ‘content moderation council' (via The Verge) Democratic U.S. 
senators accuse Musk of undermining Twitter, urge FTC probe (via Reuters) We got Twitter ‘verified' in minutes posing as a comedian and a senator (via The Washington Post) How Much Did Twitter's Verification Chaos Cost Insulin Maker Eli Lilly and Twitter Itself? (via Gizmodo) Patrick's (somewhat sarcastic) Twitter thread about the policies he hoped the platform would put in place to address Musk's conflicts of interest Saturday Night Live's content moderation council sketch Transcript View on our website Your Thoughts If you have any thoughts on this episode that you'd like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.
Stories in this episode: Day in History: 1922: Growing need for music teachers Multiple agencies come together to fight child sexual abuse material 7 updates provided during Rochester City Council's first-ever bus tour Harmony Telephone Company bringing fiber internet to Harmony amid federal, state push for broadband expansion Mayo grad Goetz gets shot to pitch in Northwoods League The Post Bulletin is proud to be a part of the Trust Project. Learn more at thetrustproject.org.
We speak to Garda Detective Sergeant Mick Moran
Child sexual abuse material (CSAM), previously known as child pornography, can be a confronting and uncomfortable topic. CSAM can refer to the possession, viewing, sharing, and creation of images or videos that involve the visual depiction of children involved in a sex act. Although CSAM was almost completely eradicated in the 1980s, the dawn of the Internet ushered in its proliferation. The anonymity of the Internet and the ease of sharing digital images of children make this material ‘one click away'. During this episode, we speak to Nicholas, a person who was convicted of and served prison time for the possession of child sexual abuse material. Importantly, Nicholas emphasizes that although he committed a non-contact offense, his crime was not victimless. His description of his life before, during, and after the offense provides critical insights regarding the factors that can lead someone to consume CSAM, the importance and effectiveness of treatment, the challenges of life after prison for someone convicted of an act of sexual harm, and many other topics that we have covered during previous episodes. We understand that hearing Nicholas' story can be confronting and uncomfortable, but we think his account provides important information that can be used to combat the proliferation of child sexual abuse material in the future. Additional Readings and Resources: Child Pornography – The United States Department of Justice; Citizen's Guide to U.S. Federal Law on Child Pornography – The United States Department of Justice; Child Pornography Offenders: Quick Facts – United States Sentencing Commission. Learn more about your ad choices. Visit megaphone.fm/adchoices
Homeland Security Investigations (HSI), the investigative unit of Immigration and Customs Enforcement (ICE), announced it initiated 4,224 child exploitation cases during the 2019 fiscal year, which began in October 2018 and concluded at the end of September. Those cases led to a total of 3,771 criminal arrests, and the identification or rescue of 1,066 victims. In this episode, The Pivot hosts and Maltego Subject Matter Experts Joe Ryan and Mario Rojas dive into the alarming topic: Child Sexual Abuse Material (CSAM) and how investigators and organizations can support the fight against it. They touch upon the following topics: 1. What technologies are involved in the production, identification, and detection of CSAM 2. Recent headlines that have to do with CSAM 3. How CSAM is investigated 4. What other types of CSAM are out there and where we can find information or reports about CSAM 5. How to prevent the spread of CSAM or get involved in the fight against CSAM ■ About The Pivot Brought to you by Maltego, The Pivot deep dives into topics pivoting from information security to the criminal underground. Each episode features interviews with experts from the industry and research fields and explores how they connect the dots. ■ About Maltego Used by investigators worldwide, Maltego is a graphical link analysis tool that allows users to mine, merge, and map data from OSINT and third-party data integrations for all sorts of investigations—cybersecurity, person of interest, fraud, and more. The podcast streams free on Spotify. You can also watch it all go down on YouTube. Don't forget to subscribe to our Twitter and LinkedIn to stay on top of our latest updates, tutorials, webinars, and deep dives. For more information about Maltego, visit our website.
In 2013, Jason Weis and Kristin Weis founded The Demand Project to address the multifaceted fight against human trafficking and the commercial sexual exploitation of children. In the episode, Kristin addresses the why behind the what, as well as answers questions on what to look for, the importance of communication, understanding predators and victims, and the importance of talking about sex in the home. Kristin also breaks down the truth behind human trafficking, online enticement, child sexual abuse material and commercial sexual exploitation of children, and how you and I can become advocates for those who do not have a voice. For more on The Demand Project and how you can help fight to end human trafficking: https://www.thedemandproject.org/
Episode Description: Mojo Vision has developed a new prototype of its augmented reality contact lenses. Biden warned that the Russian government could use cyberattacks to escalate the crisis. According to reports, Facebook might be underreporting child sexual abuse material. BONUS EPISODES Patreon: ✨www.patreon.com/latinamericaneo✨
We're restarting the podcast, welcoming Niclas Borgström back to the studio, and are joined this time by Anna Borgström, CEO of NetClean, one of VMware's most important partners, which dedicates its days to stopping CSAM (Child Sexual Abuse Material). Listen in to learn how you can help ensure that as few children as possible are exploited and help stop the perpetrators. A difficult subject, but an important one.
Pornography has become a scourge in the United States. Today on Humanize, Wesley J. Smith speaks with internet safety and anti-obscenity activist Donna Rice Hughes. Donna and Wesley have a mature discussion about one of the great affronts to human dignity, which is pornography. Donna discusses the impact porn has on children—both as victims of child pornographers and as consumers.
***TRIGGER WARNING*** This episode contains difficult content about child sexual abuse material (CSAM) that is likely to be particularly sensitive. In 2019 nearly 30 million images and over 40 million videos relating to child sexual exploitation were released online... "The most surprising part is that it's not hidden, it's on everyday tools that you and I use, it's on Google and Facebook and Microsoft, they are hiding in plain sight." Today's podcast is about a particularly difficult and sensitive topic that no one really wants to talk about, or even really acknowledge exists. But, child sexual abuse material (CSAM) sadly does exist and my guest today plays a fundamental role in helping to improve content moderation and law enforcement to more effectively protect children online. Chris Wexler is one of the founders and CEO of Krunam, a technology company that builds tools to make it easier to mitigate and stop the inhumanity of child sexual abuse, while also freeing up police time to focus more on investigating and catching the perpetrators. The company's technology is 10x more effective than PhotoDNA (the current industry-leading technology) and can find previously unknown CSAM in images, video, and very soon, live streaming, for the first time at scale. “In real-world testing, our classifier outperformed human classification because, after 10 minutes of doing this, you just get exhausted,” Chris explains. “It's a brutal, brutal way to live and work. We believe [our technology] is more humane, for not only the victims of these crimes but the workers that are fighting these crimes do not have to spend so much time with it.” We also talk about: the story behind Krunam; what businesses can do to help stop the spread of CSAM; and social enterprises and ESG as the future of business. You can follow Chris on Twitter @ChrisWexler. About The Nicole Bremner Podcast: Nicole Bremner is an investor, speaker, writer, and podcaster. 
After a successful decade building a multi-million property portfolio in London, Nicole was forced by a number of external obstacles to stop, take stock and figure out what really matters in life. Following a period of healing and reflection, she discovered that what doesn't kill you makes you stronger and, so very often, setback is followed by real success. On The Nicole Bremner Podcast, she speaks to others who have triumphed in the face of adversity and explores the lessons they have learned along the way. To find out who's coming up next on The Nicole Bremner Podcast, follow Nicole on Instagram @nsbremner and Facebook, or subscribe to her YouTube channel. You can also support the show here. Disclaimer: The views and opinions expressed in this podcast belong solely to the host and guest speakers. The views and opinions of the guest speakers do not represent those of the host. Always do your own research. Support the show (https://www.buymeacoffee.com/NicoleBremner)
Futurized goes beneath the trends to track the underlying forces of disruption in tech, policy, business models, social dynamics and the environment. I'm your host, Trond Arne Undheim, futurist and author. In episode #105 of the podcast, the topic is: The Future of Child Abuse Online. Our guest is Chris Wexler, CEO and co-founder of Krunam. In this conversation, we talk about the business of removing digital toxic waste from the internet using AI to identify Child Sexual Abuse Material (CSAM). The host of this podcast, Trond Arne Undheim, Ph.D., is the author of Health Tech: Rebooting Society's Software, Hardware and Mindset, published by Routledge in 2021; Future Tech: How to Capture Value from Disruptive Industry Trends, published by Kogan Page in 2021; Pandemic Aftermath: How Coronavirus Changes Global Society and Disruption Games: How to Thrive on Serial Failure, both published by Atmosphere Press in 2020; and Leadership From Below: How the Internet Generation Redefines the Workplace, published by Lulu Press in 2008. For an overview, go to Trond's Books at Trondundheim.com/books At this stage, Futurized is lucky enough to have several sponsors. To check them out, go to Sponsors | Futurized - thoughts on our emerging future. If you are interested in sponsoring the podcast, or to get an overview of other services provided by the host of this podcast, including how to book him for keynote speeches, please go to Store | Futurized - thoughts on our emerging future. We will consider all brands that have a demonstrably positive contribution to the future. Before you do anything else, make sure you are subscribed to our newsletter on Futurized.org, where you can find hundreds of episodes of conversations that matter to the future. I hope you can also leave a positive review on iTunes or in your favorite podcast player--it really matters to the future of this podcast. Thanks so much, let's begin. 
After listening to the episode, check out: Chris Wexler (@ChrisWexler): https://www.linkedin.com/in/chriswexler/ Krunam: https://krunam.co/ My takeaway is that Child Abuse Online is a growing problem and there is no simple technology fix. We are dealing with those who push the limits of pornography, one of the most adaptive applications on the internet. Keeping a watchful eye and reporting abuses as we come across them seems like a sensible approach; supporting businesses such as Krunam, which use AI to fight it, also makes sense. It is encouraging that there now are clever people, technologies, and organizations helping law enforcement with this endemic problem. Regulating the players whose business models touch this area more vigorously would also help. Thanks for listening. If you liked the show, subscribe at Futurized.org or in your preferred podcast player, and rate us with five stars. If you like this topic, you may enjoy other episodes of Futurized, such as episode 28, The Future of Child Trafficking | Futurized - thoughts on our emerging future, episode 96, Practicing Multimodal AI, or episode 16, The Future of Human Perception AI. To find us on social media: Instagram: https://www.instagram.com/futurized2/ Twitter (@Futurized2): https://twitter.com/Futurized2 Facebook: https://www.facebook.com/Futurized-102998138625787 LinkedIn: https://www.linkedin.com/company/futurized YouTube: https://www.youtube.com/Futurized Podcast RSS: https://feed.podbean.com/www.futurized.co/feed.xml Futurized—conversations that matter.
In this episode, Dr Rick Brown interviews AIC researchers Sarah Napier and Coen Teunissen to discuss their research on online child sexual abuse material. Napier and Teunissen examine the nature and extent of online child sexual abuse and highlight ways to prevent and disrupt this horrific crime. Content warning: this episode discusses topics of abuse that may be triggering for some listeners and is not suitable for children.
Employers added 235,000 jobs last month, fewer than expected; the number of long-term unemployed fell last month; Apple delays a plan to scan users' pictures for signs of “Child Sexual Abuse Material;” the U.N. says COVID lockdowns last year led to a temporary decrease in air pollution
We discuss Apple's planned changes coming in iOS 15 to address Child Sexual Abuse Material on its devices, and whether a corporation should really be the one with this power. Twitter: https://twitter.com/at_randompod Instagram: https://www.instagram.com/at.random.podcast/ Youtube: https://www.youtube.com/channel/UC0CpGU3dqJ4LLREYj-eovrg Discord Server: https://discord.gg/xwBWtkW
Apple's new tool for tackling Child Sexual Abuse Material received a fair bit of praise when the Cupertino-based tech giant unveiled it. But one could sense that most were just damning it with faint praise. Soon, the scepticism translated into warning shots being fired by some of the world's leading privacy researchers and tech leaders. In this episode of the YouthCast Podcast, the hosts discuss the rollout of the controversial new feature and its implications for privacy.
We took a bit of a summer break, but now we're back! Big shoutout to our subscribers who've been waiting for the new episode drop.In this episode we talk about the recently concluded Olympics. We also talk about the very controversial decision from Apple, Inc. to essentially scan devices for images that could contain child pornography.Premier League is back! And Arsenal remains the banter club for us all. Suicide Squad was really fun and we talked about some of the characters and the difference between DC and Marvel with projects like these.Finally, we touched on the horrifying situation unraveling in Afghanistan and how devastating it is. I blanked on what the acronym CSAM stood for and it's Child Sexual Abuse Material.Related links on what we talked about:https://olympics.com/tokyo-2020/olympic-games/en/results/all-sports/medal-standings.htm: Tokyo Olympics medal counthttps://youtu.be/G05nEgsXgoI: Wall Street Journal Interviewing Apple's software chief Craig Federighihttps://ourworldindata.org/covid-vaccinations?country=OWID_WRL: Tracking Covid-19 vaccinations
Apple recently announced it will begin reporting Child Sexual Abuse Material (CSAM) to law enforcement with the latest iOS 15 update. The new system aims to identify images using a process called hashing, which turns images into numbers. On this episode, we discuss how Apple's new system will work and how this bold step in combating Child Sexual Abuse is being received by privacy-sensitive users around the world.Links:Apple to combat Child Sexual Abuse Material: https://www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html National Center for Missing Exploited Children (NCMEC): Home (missingkids.org) Internet Watch Foundation (IWF): Homepage | Internet Watch Foundation (iwf.org.uk)
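The hashing idea described above can be sketched in a few lines. This is a simplified illustration, not Apple's implementation: it uses an ordinary cryptographic hash (SHA-256) in place of Apple's NeuralHash perceptual hash, and the placeholder byte strings and the `known_hashes` set are invented for the example.

```python
import hashlib

def image_hash(image_bytes: bytes) -> int:
    # Turn raw image bytes into one fixed-size number via SHA-256.
    # Note: Apple's system uses NeuralHash, a perceptual hash that
    # tolerates minor edits (resizing, re-encoding); a cryptographic
    # hash like this one changes completely if a single byte differs.
    digest = hashlib.sha256(image_bytes).digest()
    return int.from_bytes(digest, "big")

# Matching compares numbers against a database of known hashes,
# never the images themselves. These bytes are placeholders.
known_hashes = {image_hash(b"example-known-image-bytes")}

def is_match(image_bytes: bytes) -> bool:
    return image_hash(image_bytes) in known_hashes
```

The reason the real system needs a perceptual rather than cryptographic hash is visible here: re-saving a flagged image at a different quality would change its SHA-256 entirely and defeat the match, whereas a perceptual hash maps visually similar images to the same or nearby values.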
Free $75 credit to upgrade your post at Indeed.com/APPLEBITZ. Terms and conditions apply. Offer valid through Sept. 30, 2021. Will new iPhone 13 camera leaks make the iPhone 13/13 Pro more upgrade-worthy to you? Apple's stance on privacy might have shifted with its new efforts to crack down on Child Sexual Abuse Material. What does it mean for you? You can help support this show and my independent work at www.patreon.com/briantong THANK YOU! Call into the show by recording a Voice Memo and send it to applebitzshow@gmail.com
In episode 68, the ApfelNerds discuss the topic of preventing child abuse, as Apple has announced several new features in this area, including the very controversially discussed detection of child abuse imagery (CSAM, Child Sexual Abuse Material) in iCloud Photos. Also: Apple engineers view Apple's living-room strategy with concern, series production of the MacBook Pros is ramping up, the iPhone event is set to return to its usual September rhythm, the iPhone 13 is expected to get a real-time portrait mode for video and ProRes support, a MacBook Air refresh with a mini-LED display is expected in mid-2022, first hints of iOS 14.8 have been spotted, the fifth betas of the upcoming operating systems are here, macOS Big Sur 11.5.2 is available, Washington, D.C. is giving students AirPods if they get vaccinated, Kevin Lynch is said to have moved to Project Titan, Ikea unveils a HomeKit-compatible air purifier, Spotify and AirPlay 2 support, Parallels Desktop 17 is available, a survey shows customer wishes for the iPhone 13, and Microsoft introduces Windows 365.
After Apple detailed its plans for protecting against Child Sexual Abuse Material (CSAM), many questioned the impact such a decision could have. The U.S. is set to pass a new infrastructure bill that features a cryptocurrency tax provision. Researchers have created what they call a set of "master faces" to bypass facial recognition. Samsung announced a slew of foldable phones at its Unpacked event. First, Rene Ritchie joins Tech News Weekly to provide an update on Apple's recently announced plans for finding Child Sexual Abuse Material on its platform. Ritchie explains the technology, details some of the criticisms, and discusses the pros and cons of end-to-end encryption. Then, Nikhilesh De of CoinDesk stops by to discuss a provision in the U.S.'s new infrastructure bill. The bill, which is set to pass, features a provision regarding a cryptocurrency tax, but some argue the language surrounding the tax is broad and lacks clarity. Then, Mikah shares a story from Vice about a study in which researchers created a series of "master faces" that could be used to bypass facial recognition technology. Lastly, Jason and Mikah discuss the Samsung Unpacked event. Jason shares his thoughts on the latest announcements and the two ponder the utility of foldable phones and devices. Hosts: Jason Howell and Mikah Sargent Guests: Rene Ritchie and Nikhilesh De Download or subscribe to this show at https://twit.tv/shows/tech-news-weekly. Get episodes ad-free with Club TWiT at https://twit.tv/clubtwit Sponsors: itpro.tv/tnw promo code TNW30 akamai.com/tnw
That dumb mailbox out front — can it be brought into your smart home? We discuss connected mailboxes, share product tips, and draw on our own experiences. We also talk about e-bikes combined with the Apple Watch, and upcoming AirPods Pro features that older users can actually benefit from. The heaviest part of today's episode covers the much-discussed CSAM, which stands for Child Sexual Abuse Material. The whole world is talking right now about Apple's plans to stop sexual material involving children. The intention is of course excellent, but what happens when the tech giants suddenly start scanning our personal photos on our own devices? If you have questions or comments for us, visit our website 0941-podden.se, where a feedback form is ready. You can also find us on social media: Facebook, Instagram, Twitter, or YouTube. Find and follow us in your podcast player by searching for "0941-podden". Support us by leaving a rating of at least 5 stars, and feel free to leave a comment in Apple Podcasts. Links from today's episode: Sticker - AirTag Locate Sticker; Insurance - Hövding bicycle helmet; Smart mailbox - Berglund MP4000. Interested in supporting our work on the podcast and its development? We have opened up the option of giving a one-time gift or becoming a monthly supporter through the service "Buy Me a Coffee". We make the podcast because it's fun, not to make money. Contributing is entirely optional, and the podcast will always be free to listen to. If you are interested in sponsoring 0941-podden, contact us at hej@0941-podden.se. The intro music is from Audionautix.com
Apple CSAM, patent trolls, iPhone 13 rumors Apple announces "Expanded Protections for Children" Apple explains how iPhones will scan photos for child-sexual-abuse images Interview: Apple's Head of Privacy details child abuse detection and Messages safety features Apple says any expansion of CSAM detection outside of the US will occur on a per-country basis Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off Apple's New 'Child Safety' Initiatives, and the Slippery Slope An Open Letter Against Apple's Privacy-Invasive Content Scanning Technology One Bad Apple - The Hacker Factor Blog Apple's Mistake - Stratechery Apple Readies New iPhones With Pro-Focused Camera, Video Updates Apple keeps shutting down employee-run surveys on pay equity — and labor lawyers say it's illegal Apple Sinks 'Submarine Patent,' Escapes $308.5 Million Verdict Long Island man credits Apple Watch with saving his life after serious fall Kuo: Mid-2022 MacBook Air to come in multiple colors with similar form factor as upcoming MacBook Pro Apple and Netflix engaged in a bidding war for Jennifer Lawrence's new film Apple orders drama series 'Bad Monkey' starring Vince Vaughn, written by Ted Lasso executive producer Bill Lawrence Picks of the Week Andy's pick: Rifftrax Friends Rene's pick: Canon R3 Alex's pick: Bert Monroy's new image Leo's pick: iMazing Hosts: Leo Laporte, Andy Ihnatko, Rene Ritchie, and Alex Lindsay Download or subscribe to this show at https://twit.tv/shows/macbreak-weekly. Get episodes ad-free with Club TWiT at https://twit.tv/clubtwit Sponsors: udacity.com/TWiT offer code TWIT75 att.com/activearmor
Google announces new Nest cams that are more useful without a subscription; Apple combats Child Sexual Abuse Material; American Airlines gives passengers free in-flight TikTok; Citizen introduces a new protection subscription; Vudu replaces FandangoNow; and a website that ports SiriusXM channels to Spotify playlists. Listeners ask about recovering iPhone home pages, what to do about calendar spam, the difference between Apple Watch models, recovering lost iMessages, and whether cash-back apps are trustworthy. Links: Rich on Instagram; Rich on Twitter; Google Nest; Apple CSAM; AA TikTok; Citizen Protect; Vudu; SiriusXM Spotify Playlists; iPhone Home Pages; Calendar Spam; Apple Watch model comparison; Lost iMessage software. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Yesterday, word began circulating that Apple is about to release a new version of its operating system for all iPhones around the world. The new iOS will quickly identify Child Sexual Abuse Material, or child pornography. Learn more about your ad choices. Visit megaphone.fm/adchoices
Our guest on the pod this week is Chris Wexler. Chris is the CEO of Krunam, a technology-for-good company fighting Child Sexual Abuse Material and several of the internet's other most difficult challenges. Resources mentioned in this episode: Leesa Renee Hall interview, the Krunam site, The Podcast Success Team, and Paul's business coaching site.
With more people working from home than ever before due to the pandemic, another, parallel pandemic is taking place: online child sexual abuse material, which has risen sharply around the world. In this podcast, we discuss child sexual abuse material (CSAM), COVID, and technology. Presenters: Lucia Bird Ruiz-Benitez de Lugo (https://globalinitiative.net/profile/lucia-bird-ruiz-benitez-de-lugo/) and Jack Meegan-Vickers. Speakers: Fernando Ruiz (https://www.mnemonic.no/midnightsun2020/fernando-ruiz/), Head of Operations at the European Cybercrime Centre (https://www.europol.europa.eu/about-europol/european-cybercrime-centre-ec3), set up by Europol to coordinate cross-border investigations into cybercrime; Amela Efendic (http://www.tipheroes.org/amela-efendic/), director of the European Resource Centre for the Prevention of Trafficking (http://www.eurcenter.net/) and head of the Bosnia and Herzegovina Office for the International Forum for Solidarity-Emmaus (https://www.emmaus-international.org/en/); and Judie Kaberia (https://twitter.com/Judiekaberia), a fellow of the 2020 Resilience Fund (https://resiliencefund.globalinitiative.net/) of the Global Initiative Against Transnational Organized Crime. GI Research Paper: Transformative Technologies: How digital is changing the landscape of organized crime (https://globalinitiative.net/analysis/cyber-tech-organized-crime/). Global Initiative Against Transnational Organized Crime