Before the U.S. carried out missile strikes against Houthi rebels in Yemen, senior Trump administration officials discussed the plan of action. Also part of the discussion: Jeffrey Goldberg, the editor of The Atlantic, who had inadvertently been added to a group message on Signal about the missile strike. How did this happen, and what are the implications for national security? Note: NPR CEO Katherine Maher is chair of the board of the Signal Foundation, whose subsidiary makes Signal. This episode: political correspondent Sarah McCammon, White House correspondent Deepa Shivaram, and national security correspondent Greg Myre. The podcast is produced by Bria Suggs & Kelli Wessinger and edited by Casey Morell. Our executive producer is Muthoni Muturi. Listen to every episode of the NPR Politics Podcast sponsor-free, unlock access to bonus episodes with more from the NPR Politics team, and support public media when you sign up for The NPR Politics Podcast+ at plus.npr.org/politics. Learn more about sponsor message choices: podcastchoices.com/adchoices NPR Privacy Policy
Meredith Whittaker is the president of the Signal Foundation and serves on its board of directors. She is also the co-founder of NYU’s AI Now Institute. Whittaker got her start at Google, where she worked for 13 years until resigning in 2019 after she helped organize the Google Walkouts. She speaks with Oz about learning on the job, championing data privacy and being awarded the Helmut Schmidt Future Prize for “her commitment to the development of AI technology oriented towards the common good.”See omnystudio.com/listener for privacy information.
Child trafficking is a heartbreaking and urgent issue, with young lives being exploited in ways that are hard to imagine. Children are taken from their homes, often forced into dangerous and abusive situations with little chance to escape. The trauma they experience can have long-lasting effects on their physical and emotional health. Raising awareness about the harrowing realities of child trafficking is essential to protecting vulnerable children and pushing for change.

Matt Murphy is a former Green Beret and CIA agent who has dedicated his life to combating human trafficking, particularly the exploitation of children. He currently serves as the director of US operations for the Signal Foundation, a non-profit organization focused on rescuing victims of child trafficking and providing humanitarian aid in crisis situations. Today, Matt discusses the systemic issues surrounding human trafficking in the United States, including the involvement of government agencies and high-level individuals. Join in as young leaders pose inquiries about life, personal development, and beyond!

Quotes:
"The most dangerous place for your child to be is on an iPhone—statistically speaking, that's a fact. An iPhone or an iPad is, indeed, one of the most dangerous places for your child." – Matt Murphy
"When you give your children access to the world, you also give the world access to them. That's something we, as dads, really need to be aware of. We need to step back and ask, 'Hey, how are we going to lead our family?'" – Matt Murphy
"The biggest priority is trying to get foster homes, orphanages, and churches involved, especially through funding and connecting with good people in the community." – Matt Murphy
"It's easier to build strong children than to fix broken men. Rescuing them and giving them the opportunity to grow into strong individuals who can, in turn, perpetuate good people is a big thing." – Matt Beaudreau

Takeaways:
- Become more aware of the child trafficking crisis and its prevalence, both online and in local communities. Educate yourself and others on the warning signs and ways to report suspected abuse.
- Prioritize being a strong, engaged parent and role model for children in your life. Limit unsupervised digital access and foster open communication to protect them from online predators.
- Consider supporting organizations like the Signal Foundation, either through financial donations or by exploring volunteer opportunities, such as the operator training course.
- Advocate for policy changes and increased funding to address the systemic failures in the foster care system and child protective services that enable trafficking to thrive.
- Reflect on your own role as a citizen and community member. What can you do to support vulnerable children and families, and help dismantle the networks that enable this abhorrent trade?

Conclusion: The child trafficking crisis is a pervasive, systemic issue that affects communities both domestically and globally. Addressing this crisis requires a comprehensive approach that strengthens families, empowers communities, and holds institutions accountable. It is equally essential to confront the complicity and negligence of those in positions of power who may enable or overlook these abuses. Only through coordinated efforts and unwavering commitment can we begin to dismantle the structures that allow such exploitation to persist. Working together, we can create safer environments and protect vulnerable children worldwide.
What do cybersecurity experts, journalists in foreign conflicts, indicted New York City Mayor Eric Adams and Drake have in common? They all use the Signal messaging app. Signal's protocol has been the gold standard in end-to-end encryption for more than a decade, used by WhatsApp, Google and more. But it's been under fire from both authoritarian governments and well-meaning democracies who see the privacy locks as a threat. Since 2022, former Google rabble-rouser and AI Now Institute co-founder Meredith Whittaker has been president of the Signal Foundation, the nonprofit that runs the app. Kara talks with Meredith about her fight to protect text privacy, the consolidation of power and money in AI, and how nonprofits can survive in a world built on the surveillance economy. Questions? Comments? Email us at on@voxmedia.com or find Kara on Threads/Instagram @karaswisher Learn more about your ad choices. Visit podcastchoices.com/adchoices
For this episode we made an exception and decided to record an interview in English. We are talking to Meredith Whittaker, president of the Signal Foundation, the non-profit that develops the Signal app, the world's most popular multiplatform private encrypted messenger. We talk to Meredith about her way into the digital realm, how she shook up Google by organizing walkouts and more or less erasing all memories of its "don't be evil" motto, how Signal came to be and what its principles are, how she views Europe and the regulatory policies of the EU, and much, much more.
This is a special interview episode with Meredith Whittaker, the president of the Signal Foundation. I'm sure you all know, and maybe even use, the Signal messaging app. Here we sat down with Whittaker to talk all about the state of Signal today, the threat of AI to end-to-end encryption, what backdoors actually look like, and much more. This is a wide-ranging discussion where one of the few journalists who has revealed new details about backdoors (Joseph) gets to speak to one of the most important people in the world of encryption (Whittaker). Definitely take a listen. Paid subscribers got access to this episode early, by the way.
Dark Wire: The Incredible True Story of the Largest Sting Operation Ever
Signal page on government data requests
Microsoft Will Switch Off Recall by Default After Security Backlash
Telegram CEO Pavel Durov interview
Subscribe at 404media.co for early access and bonus content. Learn more about your ad choices. Visit megaphone.fm/adchoices
In this episode of Technology and Security, Dr Miah Hammond-Errey speaks with Meredith Whittaker, president of Signal. The interview explores key contemporary issues in technology and artificial intelligence (AI). They discuss the impact of AI on elections and democracies, including the need for stronger local media ecosystems and an improved focus on the 'mediating' role of social media platforms and the information ecosystem. They discuss the concentration of AI power and the business model's reliance on mass data collection, including the need to rewrite the tech stack for privacy, not surveillance. This episode also explores developing democratically focused public digital infrastructure without profit incentives, and highlights the role of open-source libraries and systems as part of the core infrastructure of the technology ecosystem. It also covers the significance of autonomy and agency in neurotech applications. They discuss how to improve tech board governance through increased personal liability, accountability and transparency. Also, how many downloads Signal has actually had!

Meredith Whittaker is the president of the Signal Foundation. She has nearly 20 years of experience in the tech industry, academia, and government, and co-founded the AI Now Institute.

Resources mentioned in the recording:
· Meredith Whittaker, link to talk
· Meredith Whittaker, link to reading
· Meredith Whittaker, link to watching
· Meredith Whittaker, link to listening
· Miah Hammond-Errey, 2024, Big Data, Emerging Technologies and Intelligence: National Security Disrupted, Routledge (20% discount code for book AFL04)
· Byte-sized diplomacy (column), The Interpreter, 3 July 2024, AI-enabled elections or deepfake democracy?

This podcast was recorded on the lands of the Gadigal people, and we pay our respects to their Elders past, present and emerging.
We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people. Thanks to the talents of those involved. Music by Dr Paul Mac and production by Elliott Brennan.
Women on executive boards are still a rarity in Germany. Start-up founders, too, are mostly men. Why is that? Can it really not be done without a quota? In any case, women have learned to network: 1,800 women have joined together in the Association of German Women Entrepreneurs (Verband deutscher Unternehmerinnen). Its new president, Christina Diem-Puello, in conversation with Helene Bubrowski. The NATO members have approved billions in aid for Ukraine, perhaps for the last time. In the closing declaration, Ukraine's NATO accession is described as "irreversible". Michael Bröcker reports directly from the NATO summit in Washington. How great is the danger posed by AI? How strongly should AI, and with it the big digital players, be regulated? Meredith Whittaker, AI researcher and president of the American Signal Foundation, takes a clear position. Table.Briefings: for better informed decisions. You decide better because you are better informed. That is the goal of Table.Briefings. With every Professional Briefing, every analysis and every background piece, we give you an information advantage, ideally even a competitive advantage. Table.Briefings offers "deep journalism": we combine the quality standards of leading media with the depth of specialist information. Try Professional Briefings for free: table.media/registrierung. Hosted on Acast. See acast.com/privacy for more information.
Meredith Whittaker turned her back on Google after raising concerns about the mass surveillance fueling AI, but she didn't leave tech entirely. The former AI whistleblower is now the President of Signal, a messaging app that keeps conversations encrypted – used by journalists, whistleblowers, drug dealers, militants and others who want to keep communications secure. So why did she blow the whistle on Google? Is privacy the answer to AI? Or does privacy cause just as much harm as surveillance? Today, President of the Signal Foundation Meredith Whittaker, ahead of her public appearance at The Wheeler Centre in Melbourne, on the tech giants who hold our future in their hands. Socials: Stay in touch with us on Twitter and Instagram Guest: President of the Signal Foundation, Meredith Whittaker
When social media is at its best, we get genuine human connection, built-in audiences, and exciting avenues for creativity and exchange. But our current social platforms are built on a surveillance model, where our data is used to predict our behavior, show us ads, and train the algorithms that keep us perpetually on the platform. It's time to explore a new vision for social media, where we don't have to give up on privacy in order to connect. In this episode, Raffi talks to prominent critics of existing social media — and the people actively reimagining it, with truly private messaging, hyperlocal communities, and a renewed sense of control over our own social data. Guests include Facebook whistleblower Frances Haugen, whose 2021 leaks made national news and put the social media giant in the Congressional spotlight; scholar and internet activist Ethan Zuckerman; Meredith Whittaker, the president of the Signal Foundation; Flipboard co-founder Mike McCue; and Harvard Law professor Jonathan Zittrain. To learn more about Technically Optimistic and to read the transcript for this episode: emersoncollective.com/technically-optimistic-podcast For more on Emerson Collective: emersoncollective.com Learn more about our host, Raffi Krikorian: emersoncollective.com/raffi Technically Optimistic is produced by Emerson Collective with music by Mattie Safer. Subscribe to our weekly newsletter: technicallyoptimistic.substack.com Follow on social media @emersoncollective and @emcollectivepodcasts Email us with questions and feedback at us@technicallyoptimistic.com To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
In this riveting episode of #InnoMinds, Audrey Tang, Taiwan's Digital Minister, and Meredith Whittaker, President of the Signal Foundation, delve into the intricate realm of online privacy. They dissect the significance of end-to-end encryption and why it falls short in safeguarding our online privacy. Audrey and Meredith also share insights into the latest developments in the open-source world, AI, and their personal connection to the open-source community. Host ⎸ Arnaud Campagne (TaiwanPlus) Guests ⎸ Meredith Whittaker and Audrey Tang Season 2 of Innovative Minds deep-dives into artificial intelligence, digital democracy, and freedom of expression with leading tech figures. This podcast is released under a CC BY 4.0 Creative Commons licence.
While the What Next: TBD team spends some time with their families during the holidays, we revisit some of 2023's biggest, strangest, and best stories. Regularly scheduled programming resumes in January. Artificial intelligence—as it already exists today—is drawing from huge troves of surveillance data and is rife with the biases built into the algorithm, in service of the huge corporations that develop and maintain the systems. The fight for the future doesn't look like war with Skynet; it's happening right now on the lines of the Writer's Guild strike. Guests: Meredith Whittaker, president of the Signal Foundation, co-founder of the AI Now Institute at NYU Originally aired May 12th, 2023 Learn more about your ad choices. Visit megaphone.fm/adchoices
If Then | News on technology, Silicon Valley, politics, and tech policy
(00:00) The origins of the 《曼報》 podcast
(14:34) Privacy as an ideal is priceless, but running it is expensive
(40:12) The founding story of the Signal Foundation
--
Business partnership rates: https://manny-li.com/sponsor/
Subscribe to the newsletter: https://manny-li.com
Follow on IG: @manny_li
Follow on FB: manny yh li
Powered by Firstory Hosting
Meredith Whittaker, president of the Signal Foundation, says a leaked French government memo risked undermining public trust in cybersecurity protocols, after it was revealed that Prime Minister Élisabeth Borne had ordered cabinet members and their staff to delete popular messaging apps like Signal and WhatsApp.
Artificial Intelligence (AI) is on every business leader's agenda. How do you ensure the AI systems you deploy are harmless and trustworthy? This month, Azeem picks some of his favorite conversations with leading AI safety experts to help you break through the noise. Today's pick is Azeem's conversation with Meredith Whittaker, president of the Signal Foundation. Meredith is a co-founder and chief advisor of the AI Now Institute, an independent research group looking at the social impact of artificial intelligence.
Internet of Humans, with Jillian York & Konstantinos Komaitis
In this episode, Konstantinos Komaitis and Jillian York talk with Meredith Whittaker, the President of the Signal Foundation. Meredith is also the current Chief Advisor, and the former Faculty Director and Co-Founder, of the AI Now Institute. Meredith shares with us the role of Signal in ensuring the privacy and security of communications, and her plans for the messaging service. We also discuss the fight for encryption, as pressure to undermine it is increasing in jurisdictions across the world. Finally, we discuss AI, its governance, and the concerns recently voiced by policy makers. This podcast is edited by Javier Pallero. The music in this episode is Nightlapse by Arthur Vyncke | https://soundcloud.com/arthurvost Music promoted by https://www.free-stock-music.com Creative Commons / Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) https://creativecommons.org/licenses/by-sa/3.0/deed.en_US
Artificial intelligence—as it already exists today—is drawing from huge troves of surveillance data and is rife with the biases built into the algorithm, in service of the huge corporations that develop and maintain the systems. The fight for the future doesn't look like war with Skynet; it's happening right now on the lines of the Writer's Guild strike. Guests: Meredith Whittaker, president of the Signal Foundation, co-founder of the AI Now Institute at NYU If you enjoy this show, please consider signing up for Slate Plus. Slate Plus members get benefits like zero ads on any Slate podcast, bonus episodes of shows like Slow Burn and Dear Prudence—and you'll be supporting the work we do here on What Next TBD. Sign up now at slate.com/whatnextplus to help support our work. Learn more about your ad choices. Visit megaphone.fm/adchoices
What happens when an outspoken critic of the technology industry finds herself at the helm of one of the largest messaging apps in the world? Meredith Whittaker made her name as one of the tech industry's strongest internal critics, helping lead the worker uprising at Google, founding an institute to rethink the ethics of AI, and promoting a platform for a real progressive politics in technology. Now, she's the president of non-profit organization Signal, which builds a messaging app of the same name known for its serious dedication to privacy. This episode, we talk with Whittaker about the current moment in tech, if privacy still matters, and what she can do to help Signal prosper, despite its Big Tech competition. Guests: Meredith Whittaker, president, the Signal Foundation
Dear friend, In this conversation, Public Interest Technologist Anivar Aravind speaks in detail with Dilli Dali about the questions being raised around WhatsApp. One: What business interests of Facebook are being served by this new move? Two: What are the consequences of this loss of trust? Three: How is a person's social graph recorded in WhatsApp? Four: What are the particular circumstances and limitations in India? Five: Why did Facebook buy Instagram? Six: Is the loss of privacy the only problem, or is there more to it? Seven: What are digital India's digital governance problems? Eight: What lapses have recently affected WhatsApp's encryption? Nine: In what circumstances did the Signal Foundation come into being? Ten: Should we abandon WhatsApp? Anivar Aravind is active in Swathanthra Malayalam Computing. He is a member of the advisory board of the Software Freedom Law Center, India. Welcome to the podcast. Duration: 25 minutes. With love, S. Gopalakrishnan dillidalipodcast.com Delhi, 14 January 2021
Signal
Updates to the privacy policy of WhatsApp, owned by Facebook, have driven unhappy users to look for alternative apps. And Elon Musk wants you to use Signal instead of Facebook. Here's what you need to know about the app. Signal is a typical one-tap-install app available in marketplaces like Google's Play Store and Apple's App Store. It is an open-source project provided for free by the non-profit Signal Foundation, and it has been used for years by high-profile privacy icons like Edward Snowden. Signal's main selling point is that it can do everything WhatsApp does, from calls and group chats to voice notes and stickers. When it comes to privacy, Signal's offering is hard to beat. It does not store your user data. And beyond its encryption capabilities, it gives you extended on-screen privacy options, including app-specific locks, blank notification pop-ups, face-blurring anti-surveillance tools and disappearing messages.
Telegram
The Telegram app offers several features. As in WhatsApp, you get the basics such as chats, group chats and channels. However, unlike WhatsApp's 256-member limit, Telegram supports groups of up to 200,000 members. It also offers multiple group-specific features, such as bots, polls, quizzes, hashtags and much more, which can make group experiences much more fun. The app also offers a unique feature, self-destructing messages (as in Snapchat), which is great if you send messages that you don't want to remain on the recipient's device for eternity. Telegram's file-sharing size limit is a whopping 1.5 GB. The app now has voice and video calls on Android and iOS devices, which is great because video-call support was a notable omission. --- Send in a voice message: https://anchor.fm/elgordocircuito/message
Computer security researcher Moxie Marlinspike is the creator of the encrypted messenger service Signal, and co-founder of the Signal Foundation: a nonprofit dedicated to global freedom of speech through the development of open-source privacy technology.
Get in touch with us: join the discussion about data science, machine learning and artificial intelligence on our Discord server. Episode transcript:

We always hear the word "metadata", usually in a sentence that goes like this: "Your Honor, I swear, we were not collecting users' data, just metadata." Usually the guy saying this sentence is Zuckerberg, but it could be anybody from Amazon or Google. "Just" metadata, so no problem. This is one of the biggest lies about the reality of data collection.
F: Ok, the first question is: what the hell is metadata?
Metadata is data about data.
F: Ok… still not clear.
Imagine you make a phone call to your mum. How often do you call your mum, Francesco?
F: Every day, of course! (coughing)
Good boy! Ok, so let's talk about today's phone call. Let's call "data" the stuff that you and your mum actually said. What did you talk about?
F: She was giving me the recipe for her famous lasagna.
So your mum's lasagna is the DATA. What is the metadata of this phone call? The lasagna has data of its own attached to it: the date and time when the conversation happened, the duration of the call, the unique hardware identifiers of your phone and your mum's phone, the identifiers of the two SIM cards, the location of the cell towers that pinged the call, the GPS coordinates of the phones themselves.
F: Yeah well, this lasagna comes with a lot of data :)
And this is assuming that this data is not linked to any other data, like your Facebook account or your web browsing history. More on that later.
F: Whoa whoa whoa, ok. Let's put a pin in that. Going back to the "basic" metadata that you describe: I think we understand the concept of data about data. I am sure you did your research, and you would love to paint me a dystopian nightmare, as always. Tell us, why is this a big deal?
Metadata is a very big deal. In fact, metadata is far more "useful" than the actual data, where by "useful" I mean that it allows a third party to learn about you and your whole life.
What I am saying is that the fact that you talk with your mum every day for 15 minutes tells me more about you than the content of the actual conversations. In a way, the content does not matter. Only the metadata matters.

F: Ok, can you explain this point a bit more?

Imagine this scenario: you work in an office in Brussels, and you go by car. Every day, you use your time in the car on the way home to call your mum. So every day around 6pm, a cell tower along the path from your office to your home pings a call from your phone to your mum's phone. Someone who is looking at your metadata knows exactly where you are while you call your mum. Every day you will talk about something different, and it doesn't really matter: your location will come through loud and clear. A lot of additional information can be deduced from this, too: for example, you are moving along a motorway, therefore you have a car. The metadata of a call to mum now becomes information on where you are at 6pm, and on how you travel.

F: I see. So metadata about the phone call is, in fact, real data about me.

Exactly. YOU are what is interesting, not your mum's lasagna.

F: You say so because you haven't tried my mum's lasagna. But I totally get your point.

Now, imagine that one day, instead of going straight home, you decide to go somewhere else. Maybe you are secretly looking for another job: your metadata records the fact that after work you visit the offices of a rival company. Maybe you are a journalist and you visit your anonymous source: your metadata records wherever you go, and one of these places is your secret meeting with your source. And anyone's metadata can be combined with yours. There will be someone who was with you at the time and place of your secret meeting. Anyone who comes in contact with you can be tagged and monitored, and their anonymity is reduced.

F: I get it. So, compared to the content of my conversation, its metadata contains more actionable information.
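The commute example above can be sketched in a few lines: given nothing but timestamped call records with cell tower IDs (all invented here for illustration), an observer recovers your daily routine without reading a word of the calls.

```python
from collections import Counter

# A hypothetical week of call records: (weekday, hour_of_day, cell_tower_id).
# Tower IDs along the E40 motorway are invented for this example.
records = [
    ("Mon", 18, "E40-km12"), ("Tue", 18, "E40-km15"), ("Wed", 18, "E40-km11"),
    ("Thu", 18, "E40-km14"), ("Fri", 18, "E40-km13"),
]

# The calls always happen at the same hour...
hours = Counter(hour for _, hour, _ in records)
print(hours.most_common(1)[0])  # (18, 5): every workday at 6pm

# ...and always ping towers along the same motorway, so the observer can
# infer a daily car commute, purely from metadata.
towers = {tower for _, _, tower in records}
print(all(t.startswith("E40") for t in towers))  # True
```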
And this is the most useful, and most precious, kind of information about me: what I do, what I like, who I am, beyond the particular conversation.

Precisely. Even if companies like Facebook or the phone companies had explicit permission to collect all of their users' data, including the full content of conversations, it would still be the metadata that generated the most actionable information. They would probably throw the content of the conversations away. In the vast majority of instances, the content does not matter. Unless you are an actual spy talking about state secrets, nobody cares.

F: Let's stay on the spy point for a minute. One could say, "So what?" I have heard this many times. So what if my metadata contains actionable information, and there are entities that collect it? If I am an honest person, I have nothing to hide.

There are two aspects to the problem of privacy: government surveillance, and corporate (in other words, private) surveillance. Government surveillance is a topic that has been covered flawlessly by Edward Snowden in his book "Permanent Record" and in the documentary about his activity, "Citizenfour". I recommend both, and in fact I think every data scientist should read and watch them. Let's just briefly mention the obvious: just because something comes from a government, it does not mean it's legal or legitimate, or even ethical or moral. What if your government is corrupt, or authoritarian? What if you are a dissident fighting for human rights? What if you are a journalist, trying to uncover government corruption?

F: In other words, it is a false equivalence to say that protecting your privacy has anything to do with having something to hide. Mass surveillance of private citizens without cause is a danger to individual freedom as well as to civil liberties. Government exists to serve its citizens, not the other way around.
To freely paraphrase Snowden: since individuals have no power compared to the government, the only way the system works is if the government is completely transparent to the citizens, so that they can collectively change it, while at the same time the individual citizens are opaque to the government, so that it cannot abuse its power. But today the opposite happens: we citizens are completely naked and exposed in front of a completely opaque government machine, running secret surveillance programs on us that we don't even know exist. We are not free to self-determine, or to do anything about government power, really.

F: We could really talk for days and days about government mass surveillance. But let's go back to metadata, and let's talk about its commercial use. Metadata for sale. You mentioned this term, "corporate surveillance". It sounds… ominous.

We live in privacy hell, Francesco.

F: I get that. According to your research, where can we find metadata?

First of all, metadata is everywhere. We are swimming in it. In each and every interaction between two people that makes use of digital technology, metadata is generated automatically, without the user's consent. When two people interact, two machines also interact, recording the "context" of this interaction: who we are, when, where, why, what we want.

F: And that doesn't seem avoidable. In fact, metadata must be generated by devices and software just to work properly. I look at it as an intrinsic component that cannot be removed from the communication system, whatever it is. The problem is who owns it. So tell me, who has such data?

It does not matter, because it's all for sale. Which means we are for sale.

F: Ok, holy s**t, this keeps getting darker.

Let's have a practical example, shall we? Have you booked a flight recently?

F: Yep. I'm going to Berlin, and in fact so are you. For a hackathon, no less.

Have you ever heard of a company called Adara?

F: No… Cannot say that I have.
Adara is a "Predictive Traveler Intelligence" company.

F: Sounds pretty pretentious. Kinda douchy.

This came up on the terrifying Twitter account of Wolfie Christl, author, among other things, of a great report about corporate surveillance for Cracked Labs. Go check him out on Twitter, he's great.

F: Sure, I will add what I find to the show notes of this episode. Oh, and by the way, you can find all this stuff on datascienceathome.com. Sorry, go ahead.

Adara collects data (metadata) about travel-related online searches, purchases, devices, passenger records and loyalty program records: data from clients that include major airlines, major airports, hotel chains and car rental chains. It creates a profile, a "traveler graph", in real time, for 750 million people around the world: a profile based on personal identifiers.

F: Uhh uhh. Then what?

Then Adara sells these profiles.

F: Ok… I have to say, the box that I tick giving consent to the third-party use of my personal data when I use an airline website does not quite convey how far my data actually goes.

Consent. LOL. Adara calculates a "traveler value score" based on customer behaviour and needs across the global travel ecosystem, over time. The score sits in the Salesforce Service Cloud, for sale to anyone. This score, and your profile, determine the personalisation of travel offers and treatment: before purchase, during booking, post purchase, at check-in, in the airport, at the destination. On their own website, Adara explains how customer service agents for their myriad of clients (for example, a front desk agent at a hotel) can instantly see the traveler value score, and will therefore treat you differently based on it.

F: Oh, so if you have money to spend, they will treat you differently.

The score is used to assess your potential value, to inform service and customer service strategies for you, as well as personalised messaging and relevant offers. And of course, the pricing you see when you look for flights. Low score?
Prepare yourself to wait to have your call rerouted to a customer service agent. Would you ever tick a box to give consent to this?

F: Fuck no. How is this even legal? What about the GDPR?

It is, in fact, illegal. Adara is based in the US, but they collect data through data warehouses in the Netherlands. They claim they are GDPR-compliant. However, they collect all the data first and decide on the specific business use later, which is definitely not GDPR-compliant.

F: Exactly! According to the GDPR, the user has to know in advance the business use of the data they are giving consent for!!

With the GDPR and future regulations, there is a way to control how the data is used and for what purpose. But regulations are still blurred or undefined when it comes to metadata. For example, there's no regulation covering the number of records in a database, or the timestamp at which such a record was created. As a matter of fact, data is useless without metadata; one cannot even collect data without metadata. WhatsApp, Telegram, Facebook Messenger… they all create metadata. So one might say, "I've got end-to-end encryption, buddy." Sure thing. How about the metadata attached to that encrypted gibberish nobody is really interested in? To show you how unavoidable the concept of metadata is: even Signal, developed by the Signal Foundation and widely considered the reference truly end-to-end encrypted, open-source protocol for confidential information exchange, can see metadata. Signal claims they just don't keep it, as stated in Signal's privacy policy: "Certain information (e.g. a recipient's identifier, an encrypted message body, etc.) is transmitted to us solely for the purpose of placing calls or transmitting messages. Unless otherwise stated below, this information is only kept as long as necessary to place each call or transmit each message, and is not used for any other purpose." This is one of those issues that should be solved with legislation.
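Why can even an end-to-end encrypted messenger "see" metadata? Because the server still has to route the message. The toy envelope below illustrates the principle only; it is not Signal's actual wire format, and the "ciphertext" is just random bytes standing in for an encrypted body.

```python
import os
import time

# A toy message envelope for an end-to-end encrypted messenger.
# NOT Signal's real protocol: the field names are invented for illustration.
envelope = {
    "sender_id": "user-4711",       # who is talking (needed for delivery/abuse control)
    "recipient_id": "user-0815",    # to whom (the server must know where to route it)
    "timestamp": time.time(),       # when the message was sent
    "ciphertext": os.urandom(64),   # the only part the server cannot read
}

# The server is blind to the ciphertext, but every other field is exactly
# the kind of metadata this episode is about.
readable_by_server = {k for k in envelope if k != "ciphertext"}
print(sorted(readable_by_server))  # ['recipient_id', 'sender_id', 'timestamp']
```

The question then becomes policy, not cryptography: does the operator keep those routing fields, as most services do, or discard them immediately, as Signal says it does?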
But, like money laundering, your data is caught in a storm of transactions so intricate that at a certain point, how do you even check? All participating companies share customer data with each other, a process called "value exchange". They let marketers utilize the data, for example to target people after they have searched for flights or hotels. Adara creates audience segments and sells them, for example to Google, for advertisement targeting. The consumer data broker LiveRamp, for example, lists Adara as a data provider.

F: Consumer data broker. I am starting to get what you mean when you say that we are for sale.

Let's talk about LiveRamp, part of Acxiom.

F: There they go... Acxiom... I've heard of them.

They self-describe as an "Identity Resolution Platform".

F: I mean, George Orwell would be proud.

Their mission? "To connect offline data and online data back to a single identifier." In other words, clients can "resolve all" of their "offline and online identifiers back to the individual consumer". Various digital profiles, like the ones generated on social media or when you visit a website, are matched to databases which contain names, postal addresses, email addresses, phone numbers, geolocations and IP addresses, and online and mobile identifiers such as cookie and device IDs.

F: Well, all this stuff is possible if and only if someone gets in possession of all these profiles, or, well... purchases them. Still, what the f**k.

A cute example? Imagine you register on any random website but you don't want to give them your home address. They just buy it from LiveRamp, which derives it from your phone's geolocation data, which is for sale. Where does your phone sit still for 12 hours every night? That's your home address. Easy.

F: And they definitely know how much time I spend at the gym, without even checking my Instagram! Ok, this is another level of creepy.
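The "where does your phone sit still every night" heuristic is embarrassingly simple to implement. Here is a minimal sketch on invented data: take the most frequent location cell during night hours and call it "home".

```python
from collections import Counter

# Hypothetical stream of (hour_of_day, lat, lon) pings from one phone,
# with coordinates already rounded to coarse grid cells. All values invented.
pings = (
    [(h, 50.850, 4.352) for h in range(0, 7)]      # night: phone at rest
    + [(h, 50.845, 4.360) for h in range(9, 18)]   # day: at the office
    + [(h, 50.850, 4.352) for h in range(22, 24)]  # evening: back again
)

# The "home" heuristic: the most common cell during night hours.
night = [(lat, lon) for h, lat, lon in pings if h < 7 or h >= 22]
home = Counter(night).most_common(1)[0][0]
print(home)  # (50.85, 4.352): the cell where the phone rests overnight
```

That is the entire trick: no content, no account, no consent dialog, just a timestamped location feed that is for sale.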
Clients of LiveRamp can upload their own consumer data to the platform, combine it with data from hundreds of third-party data providers, and then utilize it on more than 500 marketing technology platforms. They can use this data to find and target people with specific characteristics, to recognize and track consumers across devices and platforms, to profile and categorize them, to personalize content for them, and to measure how they behave. For example, clients could "recognize a website visitor" and "provide a customized offer" based on extensive profile data, without requiring said visitor to log in to the website. Furthermore, LiveRamp runs a data store where other companies can "buy and sell valuable customer data".

F: What is even the point of giving me the choice to consent to anything online?

In short, there is no point.

F: It seems we are so behind with regulations on data sharing. GDPR is not cutting it, not really. With programmatic advertising we have created a monster that has really grown out of control.

So: our lives are completely transparent to private corporations that constantly surveil us en masse and exploit all of our data to sell us shit. How does this affect our freedom? How about we just don't buy it? Can it be that simple? And I would not take no for an answer here. Unfortunately, no.

F: Oh, crap!

I'm going to read you a passage from Permanent Record: Who among us can predict the future? Who would dare to? The answer to the first question is no one, really, and the answer to the second is everyone, especially every government and business on the planet. This is what that data of ours is used for. Algorithms analyze it for patterns of established behaviour in order to extrapolate behaviours to come, a type of digital prophecy that's only slightly more accurate than analog methods like palm reading.
Once you go digging into the actual technical mechanisms by which predictability is calculated, you come to understand that its science is, in fact, anti-scientific, and fatally misnamed: predictability is actually manipulation. A website that tells you that because you liked book 1 you might also like book 2 isn't offering an educated guess as much as a mechanism of subtle coercion. We can't allow ourselves to be used in this way, to be used against the future. We can't permit our data to be used to sell us the very things that must not be sold, such as journalism. [...] We can't let the god-like surveillance we're under be used to "calculate" our citizenship scores, or to "predict" our criminal activity; to tell us what kind of education we can have, or what kind of job we can have [...]; to discriminate against us based on our financial, legal, and medical histories, not to mention our ethnicity or race, which are constructs that data often assumes or imposes. [...] If we allow [our data] to be used to identify us, then it will be used to victimize us, even to modify us, to remake the very essence of our humanity in the image of the technology that seeks its control.

Of course, all of the above has already happened.

F: In other words, we are surveilled, our data is collected, and it is used to affect every aspect of our lives: what we read, what movies we watch, where we travel, what we buy, who we date, what we study, where we work… This is a self-fulfilling prophecy for all of humanity, and the prophet is a stupid, imperfect algorithm optimised just to make money.

So I guess my message of today for all data scientists out there is this: just… don't.

References:
- Signal source code: https://github.com/signalapp
- Wolfie Christl's corporate surveillance report for Cracked Labs: https://crackedlabs.org/en/corporate-surveillance
- wolfie.crackedlabs.org