Podcasts about Guillaume Chaslot

  • 20 podcasts
  • 22 episodes
  • 34m average duration
  • Infrequent episodes
  • Latest: Jun 3, 2024

POPULARITY (2017–2024)


Best podcasts about Guillaume Chaslot

Latest podcast episodes about Guillaume Chaslot

radinho de pilha
o legendário exército de 150 casais, somos 8 bilhões de cobaias? a internet bagunçou a tribo!

Jun 3, 2024


Down the Rabbit Hole by Design — with Guillaume Chaslot: https://youtu.be/b9DyYUA0dog?si=2r5hXekdoubun5rK
The Internet's Final Frontier: Remote Amazon Tribes: https://www.nytimes.com/2024/06/02/world/americas/starlink-internet-elon-musk-brazil-amazon.html
The Sacred Band of Thebes: https://www.hellenic.org.au/post/the-sacred-band-of-thebes
"Often without acceptance in their own families, the LGBT+ population finds a space of protection, joy, and belonging in cultural venues.": https://x.com/GloboNews/status/1796986760376762783
The elite ancient Greek fighting force made up ...

Your Undivided Attention
Synthetic Humanity: AI & What's At Stake

Feb 16, 2023 · 46:25


It may seem like the rise of artificial intelligence, and the increasingly powerful large language models you may have heard of, is moving really fast… and it IS. But what's coming next is when we enter synthetic relationships with AI that could come to feel just as real and important as our human relationships... and perhaps even more so. In this episode of Your Undivided Attention, Tristan and Aza reach beyond the moment to talk about this powerful new AI, and the new paradigm of humanity and computation we're about to enter. This is a structural revolution that affects way more than text, art, or even Google search. There are huge benefits to humanity, and we'll discuss some of those. But we also see that as companies race to develop the best synthetic relationships, we are setting ourselves up for a new generation of harms made exponentially worse by AI's power to predict, mimic and persuade. It's obvious we need ways to steward these tools ethically. So Tristan and Aza also share their ideas for creating a framework for AIs that will help humans become MORE humane, not less.

RECOMMENDED MEDIA
Cybernetics: or, Control and Communication in the Animal and the Machine by Norbert Wiener – a classic and influential work that laid the theoretical foundations for information theory
New Chatbots Could Change the World. Can You Trust Them? – The New York Times addresses misinformation and how Siri, Google Search, online marketing and your child's homework will never be the same
Out of One, Many: Using Language Models to Simulate Human Samples by Lisa P. Argyle, Ethan C. Busby, Nancy Fulda, Joshua Gubler, Christopher Rytting, David Wingate – this paper proposes and explores the possibility that language models can be studied as effective proxies for specific human sub-populations in social science research
Earth Species Project – co-founded by Aza Raskin, a non-profit dedicated to using artificial intelligence to decode non-human communication
Her (2013) – a science-fiction romantic drama film written, directed, and co-produced by Spike Jonze
What A Chatty Monkey May Tell Us About Learning To Talk – NPR explores the fascinating world of gelada monkeys and the way they communicate

RECOMMENDED YUA EPISODES
How Political Language is Engineered with Drew Westen & Frank Luntz
What is Humane Technology?
Down the Rabbit Hole by Design with Guillaume Chaslot

Sur le fil
[HORS SERIE] Guillaume Chaslot: ces algorithmes qui nous gouvernent

Aug 19, 2022 · 25:58


Guillaume Chaslot is one of the first whistleblowers on the excesses of tech. This engineer, a specialist in artificial intelligence who notably worked at Google, lived through tech's golden age in Los Angeles, when Google, YouTube and Facebook were growing at full speed. The mood was euphoric, and developers like Guillaume Chaslot believed they were going to change the world by letting everyone share and learn without borders. Very early on, he worried about the consequences of recommendation algorithms, realizing that they locked users into "filter bubbles" that kept offering them the same content. In 2016, he realized that these algorithms also favored toxic content such as conspiracy theories. He built a software bot that made it possible to demonstrate this mechanism. He is the founder of AlgoTransparency.org, which has focused on YouTube's algorithm. "Recommendation algorithms have become a kind of Frankenstein, a kind of monster that has escaped its creator's control. We don't know what it does, we don't know why it does it; we only know that it generates more watch time," he told us. Guillaume Chaslot is therefore very critical of Google, YouTube and TikTok, which we interviewed. Here are a few points from those interviews, beyond the ones cited in our podcast: In 2013, Guillaume Chaslot was let go by Google. The company cited insufficient performance. For his part, he cites disagreements, because he was spending too much time developing alternatives to "filter bubbles" that did not interest his employer. Google and TikTok stressed that their algorithms have evolved since Guillaume Chaslot's research.
Google told our San Francisco correspondent Julie Jammot that consumption of problematic content tied to the recommendation algorithm is now below 1%. Google also stated that Guillaume Chaslot had no privileged access to data about the recommendation algorithm, and denies that the company ignored his observations while he worked there. TikTok, for its part, said it has introduced features to keep users from becoming "addicted" to the platform and to encourage them not to post or binge-watch toxic content: a "screen time tool" to decide how much time to spend on the app each day, "family features" to help parents talk with their teens and decide how much time they spend on the platform, and mental-health information to help users choose which content to share. TikTok also notes that content glorifying suicide is banned, and that it is working to limit the number of "sad videos". Finally, the company has just announced its intention to share more data about its algorithms with researchers. Today Guillaume Chaslot works at the Peren, the center of expertise that advises the French government on digital regulation. He argues for more transparency from these applications, whose impact on society is immense. Produced by Michaëla Cancela-Kieffer, with Timothée David and Antoine Boyer on mixing. Original music: Michael Liot. Artwork: Fred Bourgeais. Communications: Coline Sallois. Thanks to Dan Reed for the photo of Guillaume Chaslot. Slow Fil is produced by the team behind Sur le Fil, Agence France-Presse's daily podcast. Comments? Write to us at podcast@afp.com or on our Instagram account. You can also send us a voice note on WhatsApp at +33 6 79 77 38 45.

Your Undivided Attention
The Invisible Influence of Language

Feb 24, 2022 · 40:19


One of the oldest technologies we have is language. How do the words we use influence the way we think? The media can talk about immigrants scurrying across the border, versus immigrants crossing the border. Or we might hear about technology platforms censoring us, versus moderating content. If those word choices shift public opinion on immigration or technology by 25%, or even 2%, then we've been influenced in ways we can't even see. Which means that becoming aware of how words shape the way we think can help inoculate us from their undue influence. And further, consciously choosing or even designing the words we use can help us think in more complex ways – and address our most complex challenges. This week on Your Undivided Attention, we're grateful to have Lera Boroditsky, a cognitive scientist who studies how language shapes thought. Lera is an Associate Professor of Cognitive Science at UC San Diego, and the editor-in-chief of Frontiers in Cultural Psychology.

Clarification: in the episode, Aza refers to Elizabeth Loftus' research on eyewitness testimony. He describes an experiment in which a car hit a stop sign, but the experiment actually used an example of two cars hitting each other.

RECOMMENDED MEDIA
How language shapes the way we think – Lera Boroditsky's 2018 TED talk about how the 7,000 languages spoken around the world shape the way we think
Measuring Effects of Metaphor in a Dynamic Opinion Landscape – Boroditsky and Paul H. Thibodeau's 2015 study about how the metaphors we use to talk about crime influence our opinions on how to address crime
Subtle linguistic cues influence perceived blame and financial liability – Boroditsky and Caitlin M. Fausey's 2010 study about how the language used to describe the 2004 Super Bowl "wardrobe malfunction" influences our views on culpability
Why are politicians getting 'schooled' and 'destroyed'? – BBC article featuring the research of former Your Undivided Attention guest Guillaume Chaslot, which shows the verbs YouTube is most likely to include in titles of recommended videos, such as "obliterates" and "destroys"

RECOMMENDED YUA EPISODES
Mind the (Perception) Gap: https://www.humanetech.com/podcast/33-mind-the-perception-gap
Can Your Reality Turn on a Word?: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word
Down the Rabbit Hole by Design: https://www.humanetech.com/podcast/4-down-the-rabbit-hole-by-design

The Infotagion Podcast with Damian Collins MP
Online Safety Bill special series: Laura Edelson, Renée diResta and Guillaume Chaslot

Oct 25, 2021 · 21:10


How do user interactions shape content algorithms? Are vulnerable people being purposefully targeted with harmful content? And how realistic is it that harmful content will be taken down before it causes harm? In this Online Safety Bill special series, Damian Collins MP unpacks the evidence heard from Laura Edelson, Guillaume Chaslot, and Renée DiResta, followed by a discussion with committee member Lord (Tim) Clement-Jones.

Futurables
Algorithmes, et si on reprenait le contrôle ? - avec Guillaume Chaslot, expert IA et Datascience du PEReN

Oct 12, 2021 · 6:21


Is there still time to escape the reign of personalized recommendation? Guillaume Chaslot spent three years at YouTube & Google in Los Angeles working on the video-sequencing algorithm, which recommends the videos shown to the right of the one currently playing. He explains that more than 70% of the videos watched on YouTube come from this algorithm, which manipulates attention and locks users into filter bubbles. In the age of AI, how can we build ethics into algorithms from the design stage? Hosted by Ausha. Visit ausha.co/politique-de-confidentialite for more information.

She Likes Tech - der Podcast über Technologie
Investigativ Spezial: Im Kaninchenbau (2/4)

Aug 3, 2021 · 29:37


At the start of the pandemic, fitness YouTuber "Coach Cecil" planted the first doubts in Selina Fullert's mind about how dangerous the coronavirus really is. As she began searching the web for information about the virus, she slipped ever deeper into a world of skepticism, doubt, and stories about the coronavirus, until she was eventually organizing demonstrations herself as the initiator of "Querdenken Hamburg". In this special episode of "She Likes Tech – Investigativ", we descend with you and Selina Fullert into the "corona rabbit hole", speak with former YouTube employee Guillaume Chaslot, who recounts how it all began in Silicon Valley, and retrace Selina Fullert's YouTube history from the first Coach Cecil videos to her peak as a Querdenken demo organizer. It is the second of four "She Likes Tech – Investigativ" special episodes, which you can hear here, as usual, every two weeks. We hope you enjoy our special, and as always: if you have questions or suggestions, email us any time at shelikestech[at]ndr[punkt]de

Wissenschaft im Corona-Jahr – zwischen Vertrauen und Skepsis (science in the corona year, between trust and skepticism): https://www.bosch-stiftung.de/de/news/wissenschaft-im-corona-jahr-zwischen-vertrauen-und-skepsis
Coach Cecil: https://de.wikipedia.org/wiki/Cecil_Egwuatu and https://www.zdf.de/nachrichten/panorama/coronavirus-coach-cecil-mythen-100.html
Rabbit Hole: https://www.newyorker.com/culture/cultural-comment/the-rabbit-hole-rabbit-hole and https://www.youtube.com/watch?v=q3qXEAzqQbI
The YouTube algorithm: https://techcrunch.com/2012/10/12/youtube-changes-its-search-ranking-algorithm-to-focus-on-engagement-not-just-clicks/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAAB6lOksCReuTRxWOaVsv6OXbqmwFgaQLaJyRfjWT_ZH6bOgTacHAxWF5Tw57p-Mqci1hSQaCum1Ecizvz988MxPC0XGt_y5PHFGQgfDqpjs377uHPzXbndNDgRkNt92x8TRrshEC_RSJGHTp_CTidO6309A1yt3hwEOKpPb8EYgH
https://www.klicksafe.de/service/aktuelles/news/detail/studie-welche-inhalte-machen-die-empfehlungsalgorithmen-von-youtube-sichtbar/ and https://www.tubefilter.com/2016/06/23/reverse-engineering-youtube-algorithm/
Guillaume Chaslot: https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth and https://www.dw.com/de/warum-bevorzugt-youtube-extreme-videoclips/av-50326886
More to watch in the ARD documentary: https://www.ardmediathek.de/video/reportage-und-dokumentation/die-story-im-ersten-querdenker/das-erste/Y3JpZDovL2Rhc2Vyc3RlLmRlL3JlcG9ydGFnZSBfIGRva3VtZW50YXRpb24gaW0gZXJzdGVuLzcwZDllMmNjLTNiNjYtNDgwOC1hMmZjLTRlNjRhNDdiMzUxMA/
More to watch from strg+f on YouTube: https://www.youtube.com/channel/UCfa7jJFYnn3P5LdJXsFkrjw
More to read in the NDR web dossier: https://story.ndr.de/querdenker/index.html

On Point
The Perilous Power Of Social Media Platforms

Feb 4, 2021 · 46:54


Social media platforms have immense power, from shutting down voices to amplifying what we see. But is that singular power perilous to democracy? Shoshana Zuboff, Guillaume Chaslot and Ramesh Srinivasan join Meghna Chakrabarti.

Gospel Tech
37. The Social Dilemma

Oct 13, 2020 · 41:47


Here are the promised resources for learning more on this subject:
The Tech-Wise Family, Andy Crouch
Digital Minimalism, Cal Newport
Glow Kids, Nicholas Kardaras
The Shallows, Nicholas Carr
The Four, Scott Galloway
12 Ways Your Phone Is Changing You, Tony Reinke

The Gospel brings us back to hope, so we don't stare at tech and go "Oh man, it's gonna be bad, better run away." Instead we say "Wow, this is gonna be bad, we'd better stay close." This film includes interviews with lots of people who have a deep understanding of what's going on in the world of tech development. Here are just a few:
- Tristan Harris, former Google ethicist and co-founder of the Center for Humane Technology
- Shoshana Zuboff, PhD, Harvard Business School, author of The Age of Surveillance Capitalism
- Jeff Seibert, former Twitter executive
- Roger McNamee, Facebook venture capitalist
- Jonathan Haidt, PhD, NYU Stern School of Business, author of The Righteous Mind: Why Good People Are Divided by Politics and Religion
- Aza Raskin, inventor of the infinite scroll
- Cathy O'Neil, PhD, data scientist, author of Weapons of Math Destruction
- Sandy Parakilas, former Facebook / Uber management
- Guillaume Chaslot, former YouTube engineer, IntuitiveAI CEO, founder of AlgoTransparency

This video is well worth a watch. The information is pertinent and powerful, and it's worth a chance to reflect. Three major take-aways that make me believe everyone should watch it:
1. This tech conversation is complicated. In the first minutes they ask "What is the problem?" and every person stumbles over their words. It's not that there isn't a problem; it's simply that a one-word answer, or one sentence, feels cheap.
2. It will not resolve itself. There is literally no incentive for trillion-dollar companies to simply give up on their trade.
3. There is hope, lots of it. But without action that hope will be squandered.
I'd add one more advantage we have for this conversation: we know from whom we get our hope. It is the definition of unloving to be indifferent to the helpless and the needy. Our kids, our grandkids, our friends and family, even our enemies, are being eaten alive by tech that is simply smarter than they are. It's costing us all as a country, sure, but more importantly it's impacting the hearts and minds of people who need to know their purpose and be shown the great love God has for them.

Path & Purpose
Guillaume Chaslot - Ethical AI, Digital Awareness, YouTube's Rabbit Hole Recommendation Algorithm

Sep 15, 2020 · 36:57


Guillaume Chaslot holds a PhD in artificial intelligence and has worked at Microsoft, Google, and Mozilla. He was on the team responsible for developing the YouTube recommendation algorithm. In this episode we explore how such algorithms can seize and mislead human attention. Think rabbit holes. Conspiracy theories. Divisive content. What do we need to know today to stay safe and aware online? What can we do to engage with each other in more socially just and healthier ways at work, in product development and at home to nurture a truly connected world? Following his time at Google, Guillaume created AlgoTransparency.org, a project on a quest to understand how certain questionable content is pushed through YouTube's AI and how to turn the tide toward integrity and improve the future of humanity. https://algotransparency.org https://twitter.com/gchaslot?lang=en

Enlaces: Ventana abierta al mundo digital
¿Por qué prefiere YouTube los videos extremos?

May 21, 2020 · 1:07


YouTube's algorithm evaluates watch time and reactions, but not the quality of the content. Extreme clips keep viewers on the platform longer and allow for lots of advertising. Guillaume Chaslot calls for a rethink.

Good Code
The Tech Worker Uprising

Jan 7, 2020 · 14:34


In 2018 and 2019, tech workers stood up, pushed back, and said no to projects and working environments they disagreed with. At Amazon, Google, and Microsoft, they said no to government contracts and no to sexual harassment. Are we witnessing a revolution? A unique movement led by a new generation of data scientists with a strong conscience? We explore this question with Guillaume Chaslot, who worked on YouTube's controversial recommendation algorithm, and with computer science PhD students.

Best of the Left - Leftist Perspectives on Progressive Politics, News, Culture, Economics and Democracy
#1322 Freedom of Speech vs Freedom of Reach (Regulating Facebook)

Dec 4, 2019 · 81:45


Air Date 12/4/2019. Today we take a look at some solutions to our social media problems of fake news, propaganda, and hate in all forms spreading like wildfire through society. Be part of the show! Leave us a message at 202-999-3991

EPISODE SPONSORS: Madison-Reed.com (Promo Code: LEFT) | Clean Choice Energy
SHOP AMAZON: Amazon USA | Amazon CA | Amazon UK
MEMBERSHIP ON PATREON (Get AD FREE Shows & Bonus Content!)

SHOW NOTES
Ch. 1: Are Democrats Breaking Up Big Tech with Tim Wu - The Takeaway Politics with Amy Walter - Air Date 10-25-19. Amy Walter interviews several experts about the role of big tech in politics and policy in our society.
Ch. 2: FaceBOO - BackTalk with Dahlia and Amy - Air Date 10-31-19. Dahlia and Amy discuss Zuckerberg and Facebook and their unstoppable capitalist power and right-wing centrism.
Ch. 3: Mark Zuckerberg Thinks He's A Free Speech Advocate with Kate Klonick - On The Media - Air Date 10-25-19. Kate Klonick discusses AOC questioning Zuckerberg about Facebook's role in free speech and politics.
Ch. 4: Down The Rabbit Hole By Design with Guillaume Chaslot - Your Undivided Attention with Tristan Harris and Aza Raskin - Air Date 7-10-19. Aza, Tristan, and Guillaume Chaslot discuss the vicious internet algorithms that are draining the life of humans.
Ch. 5: Are Democrats Breaking Up Big Tech with Cecelia King - The Takeaway Politics with Amy Walter - Air Date 10-25-19. Amy Walter interviews several experts about the role of big tech in politics and policy in our society.
Ch. 6: With Great Power Comes No Responsibility - Your Undivided Attention with Aza Raskin and Tristan Harris - Air Date 6-25-19. Aza, Tristan, and Yael Eisenstat discuss the massive power Facebook has over the lives and policies of the American people.

VOICEMAILS
Ch. 7: Christian missionary colonization - Heather from Colorado
Ch. 8: The curation is needed - Mark in Colorado
Ch. 9: Monthly contributor - Mark in Colorado
Ch. 10: First-time monthly member - Stacy from San Francisco Bay Area

FINAL COMMENTS
Ch. 11: Final comments on how human curation is like Tinkerbell

MUSIC (Blue Dot Sessions): Opening Theme: Loving Acoustic Instrumental by John Douglas Orr | Begrudge - Darby | Inessential - Bayou Birds | Waterbourne - Algea Fields | Jackbird - Feathers | Gondola Blue - Towboat | Voicemail Music: Low Key Lost Feeling Electro by Alex Stinnent | Closing Music: Upbeat Laid Back Indie Rock by Alex Stinnent

Produced by Jay! Tomlinson. Thanks for listening! Visit us at BestOfTheLeft.com. Support the show via Patreon. Listen on iTunes | Stitcher | Spotify | Alexa Devices | +more. Check out the BotL iOS/Android app in the app stores! Follow at Twitter.com/BestOfTheLeft. Like at Facebook.com/BestOfTheLeft. Contact me directly at Jay@BestOfTheLeft.com. Review the show on iTunes and Stitcher!

Probablement?
Rendre YouTube bénéfique avec Guillaume Chaslot | Probablement?

Sep 6, 2019 · 105:04


I welcome Guillaume Chaslot: PhD in AI, former developer of YouTube's algorithm, creator of AlgoTransparency, and currently a Mozilla Fellow.
Find Guillaume online!
https://twitter.com/gchaslot/
https://algotransparency.org/
https://www.youtube.com/watch?v=9uoKpp_jISA
Guillaume's recommendation (which I heartily second!): the Your Undivided Attention podcast: https://humanetech.com/podcast/
Facebook: https://www.facebook.com/Science4Allorg/
Twitter: https://twitter.com/le_science4all
Tipeee: https://www.tipeee.com/science4all
My merchandise: https://shop.spreadshirt.fr/science4all
My upcoming dates: https://www.dropbox.com/s/t3abghdmh5964sx/Actu.txt?dl=0
La formule du savoir (my book): https://laboutique.edpsciences.fr/produit/1035/9782759822614/La%20formule%20du%20savoir
A Roadmap for the Value-Loading Problem: https://arxiv.org/abs/1809.01036
Probablement? as audio: http://playlists.podmytube.com/UC0NCbj8CxzeCGIF6sODJ-7A/PLtzmb84AoqRQ0ikLb4yC4lKgjeDEIpE1i.xml
My podcast with Mr Phi, YouTube version: https://www.youtube.com/channel/UCNHFiyWgsnaSOsMtSoV_Q1A
Audio version: http://feeds.feedburner.com/Axiome
Subtitles on the other videos: http://www.youtube.com/timedtext_cs_panel?tab=2&c=UC0NCbj8CxzeCGIF6sODJ-7A

Your Undivided Attention
Down the Rabbit Hole by Design

Jul 10, 2019 · 54:29


When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
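The engagement-first ranking this episode describes can be sketched in a few lines. This is an illustrative toy, not YouTube's actual system: the `Video` fields, the scores, and the `rank_by_engagement` function are all invented for the example.

```python
# Toy sketch of an engagement-maximizing recommender: candidates are ranked
# purely by predicted watch time, and quality is never consulted.
# All names and numbers here are hypothetical.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # output of an engagement model
    quality_score: float            # ignored by the ranker below


def rank_by_engagement(candidates: list[Video]) -> list[Video]:
    """Order candidates by predicted watch time alone, highest first."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)


candidates = [
    Video("How to tie a bow tie", predicted_watch_minutes=3.0, quality_score=0.9),
    Video("Shocking conspiracy EXPOSED", predicted_watch_minutes=11.0, quality_score=0.1),
]

# The sensational video is recommended first, regardless of quality_score.
print(rank_by_engagement(candidates)[0].title)
```

The point of the sketch is the objective function: because only `predicted_watch_minutes` enters the sort key, any content attribute not captured by engagement is invisible to the ranking.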

Trending
How YouTube decides what you should watch

May 24, 2019 · 22:34


Why are there so many conspiracy videos on YouTube? The company has clamped down on extremist and dangerous content, but conspiracies, outright fakes, and hoaxes are still very easy to find. Sometimes they’re only watched by a few people, but often these videos go viral. The reason why they so often pop up on your screen, says former Google employee Guillaume Chaslot, is YouTube’s algorithm. Chaslot was one of the engineers who helped shape the YouTube recommendation engine, the mechanism that determines which videos the site suggests you watch next. He was sacked in 2013, and since then he has become a critic of the company. He now says that YouTube’s obsession with keeping people watching has turned the platform into an incubator for false, incendiary, and sensationalist content – and this, in turn, is having a very real impact on the world we live in. Presenter: Marco Silva Photo caption: YouTube logo on a smartphone Photo credit: Getty Images

Business Daily
Youtube: Cracking down on crackpots

Apr 29, 2019 · 17:50


What does the video-sharing site need to do in order to stop inadvertently promoting dangerous conspiracy theories and extremist content? Alex Jones's InfoWars channel (pictured) - which among other things propagated the lie that the Sandy Hook school shooting in the US was faked - has already been banned from YouTube, although his videos still find their way onto the site. Meanwhile the social media platform has also been clamping down on the vaccination conspiracists blamed for causing the current measles epidemic, as well as the far-right extremists said to have inspired terrorists such as the New Zealand mosque shooter. But is tougher curating of content enough? Or does YouTube's very business model depend on the promotion of sensationalism and extremism by its algorithms? Ed Butler speaks to Mike Caulfield of the American Democracy Project, former YouTube engineer Guillaume Chaslot, and Joan Donovan, who researches the Alt Right at Harvard. (Picture: Screenshot of an Alex Jones InfoWars video on YouTube, taken on 29 April 2019, despite the banning of his channel by YouTube)

Techdirt
Free Speech & Content Moderation (Panel Discussion)

Oct 9, 2018 · 63:02


For this week's episode of the podcast, we're featuring a recent panel discussion from Mozilla's Speaker Series. Mike Masnick sat down with Guillaume Chaslot from Algo Transparency, hosted by Mozilla Fellow in Residence Renée DiResta, to talk about the challenges of online content moderation and its implications for freedom of expression. Enjoy!

O'Reilly Data Show - O'Reilly Media Podcast
The importance of transparency and user control in machine learning

Apr 12, 2018 · 23:19


In this episode of the Data Show, I spoke with Guillaume Chaslot, an ex-YouTube engineer and founder of AlgoTransparency, an organization dedicated to helping the public understand the profound impact algorithms have on our lives. We live in an age when many of our interactions with companies and services are governed by algorithms. At a […]

El Siglo 21 es Hoy
¿Cómo retiene YouTube a la audiencia?

Feb 5, 2018 · 16:33


Procrastinating by watching videos on YouTube is by now familiar to much of the world's population, and questionable both on a personal level and as a social system. How much time do we lose? And who gains from it? YouTube? Is it manipulating us? And, more importantly, will this take us somewhere good, or ruin us as a society? Let's talk about the algorithm built to keep us in front of the screen, and about the criticisms that Guillaume Chaslot, a former YouTube programmer, has revealed. According to some reports, Alphabet (the parent company of Google and YouTube) posted a loss of 3 billion dollars in the last quarter of 2017. That, I venture to interpret, could be connected to tax accounting and stock-market maneuvers, but we should not forget advertising revenue from the search engine; and it may also give us clues about what is happening with YouTube. Story at Canaltech: https://canaltech.com.br/resultados-financeiros/google-registra-perda-liquida-de-us-3-bilhoes-no-quarto-trimestre-de-2017-107772/ Story at Swissinfo: https://www.swissinfo.ch/spa/p%C3%A9rdida-neta-de-usd-3.000-millones-en-4%C2%BA-trimestre-para--alphabet--matriz-de-google/43870448 Story at Genbeta: https://www.genbeta.com/redes-sociales-y-comunidades/el-algoritmo-tiene-sesgos-perturbadores-y-peligrosos-un-ex-ingeniero-de-youtube-critica-como-se-fabrican-los-videos-recomendados
