POPULARITY
This is our second interview analyzing the impact of Google's decision not to deprecate third-party cookies in its Chrome browser. Daniel Jaye is a seasoned technology industry executive and is currently the founder and CEO of Aqfer, a Marketing Data Platform on top of which businesses can build their own MarTech and AdTech solutions. Daniel has provided strategic, tactical, and technology advisory services to a wide range of marketing technology and big data companies; clients have included Brave Browser, Altiscale, ShareThis, Ghostery, OwnerIQ, Netezza, Akamai, and Tremor Media. He was the founder and CEO of Korrelate, a leading automotive marketing attribution company (purchased by J.D. Power in 2014), as well as the former president of TACODA (bought by AOL in 2007). Daniel was also the founder and CTO of Permissus, an enterprise privacy compliance technology provider. Earlier, he was founder and CTO of Engage, acting CTO of CMGI, and director of High Performance Computing at Fidelity Investments. He also worked at Epsilon and Accenture (formerly Andersen Consulting). Daniel graduated magna cum laude from Harvard University with a BA in Astronomy and Astrophysics and Physics.

References:
- Daniel Jaye on LinkedIn
- Aqfer
- P3P: Platform for Privacy Preferences (W3C)
- Luke Mulks (Brave Browser) on Masters of Privacy
- Adnostic: Privacy Preserving Targeted Advertising (paper by Vincent Toubiana, Arvind Narayanan, Dan Boneh, Helen Nissenbaum, Solon Barocas)
Dr. Blodgett is a senior researcher in the Fairness, Accountability, Transparency, and Ethics in AI (FATE) group at Microsoft Research Montréal. She is broadly interested in examining the social and ethical implications of natural language processing technologies; she has developed approaches for anticipating, measuring, and mitigating harms arising from language technologies, focusing on the complexities of language and language technologies in their social contexts, and on supporting NLP practitioners in their ethical work. Dr. Blodgett has also worked on using NLP approaches to examine language variation and change (computational sociolinguistics), for example developing models to identify language variation on social media. She completed a Ph.D. in computer science at the University of Massachusetts Amherst, working in the Statistical Social Language Analysis Lab under the guidance of Brendan O'Connor, where she was also supported by an NSF Graduate Research Fellowship. Dr. Blodgett received a B.A. in mathematics from Wellesley College. She interned at Microsoft Research New York in summer 2019, where she had the good fortune of working with Solon Barocas, Hal Daumé III, and Hanna Wallach.

Ready to explore the research we discuss on this week's episode?
https://www.youtube.com/watch?v=lLlj2LZGfQc
https://www.researchgate.net/publication/371924250_Taxonomizing_and_Measuring_Representational_Harms_A_Look_at_Image_Tagging

Join Kathleen and Tricia at CEESA's conference in Malta this March. Learn more: https://www.ceesaconference.com/2024/

For a transcript of this episode, head to https://unhingedcollaboration.com/
Machine Learning can improve decision making in a big way -- but it can also reproduce human biases and discrimination. Solon Barocas joins Vasant Dhar in episode 24 of Brave New World to discuss the challenges of solving this problem.

Useful resources:
1. Solon Barocas at his website, Cornell, Google Scholar and Twitter.
2. Fairness and Machine Learning -- Solon Barocas, Moritz Hardt and Arvind Narayanan.
3. Danger Ahead: Risk Assessment and the Future of Bail Reform -- John Logan Koepke and David G. Robinson.
4. Fairness and Utilization in Allocating Resources with Uncertain Demand -- Kate Donahue and Jon Kleinberg.
5. Profiles, Probabilities, and Stereotypes -- Frederick Schauer.
6. Thinking Like a Lawyer: A New Introduction to Legal Reasoning -- Frederick Schauer.
7. Measuring the predictability of life outcomes with a scientific mass collaboration -- Matthew Salganik and others.
8. Inherent Trade-Offs in the Fair Determination of Risk Scores -- Jon Kleinberg, Sendhil Mullainathan and Manish Raghavan.
9. Limits to Prediction -- Arvind Narayanan and Matthew Salganik.
10. The Fragile Families Challenge.
11. Daniel Kahneman on How Noise Hampers Judgement -- Episode 21 of Brave New World.
12. Noise: A Flaw in Human Judgment -- Daniel Kahneman.
13. Dissecting "Noise" -- Vasant Dhar.
14. Nudge: Improving Decisions About Health, Wealth, and Happiness -- Richard Thaler and Cass Sunstein.
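For readers who want to see what measuring fairness can look like in practice, below is a minimal, hypothetical Python sketch (not from the episode) that computes two common group-fairness gaps -- demographic parity and equal opportunity -- on synthetic data. All names and numbers here are illustrative assumptions.

```python
# Illustrative only: two common group-fairness checks on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a binary group label, true outcomes, and model predictions.
group = rng.integers(0, 2, size=1000)
y_true = rng.integers(0, 2, size=1000)
y_pred = rng.integers(0, 2, size=1000)

def selection_rate(pred, mask):
    """Fraction of positive predictions within a group."""
    return pred[mask].mean()

def true_positive_rate(pred, true, mask):
    """Among truly positive members of a group, fraction predicted positive."""
    positives = mask & (true == 1)
    return pred[positives].mean()

# Demographic parity: do the groups receive positive predictions at similar rates?
dp_gap = abs(selection_rate(y_pred, group == 0)
             - selection_rate(y_pred, group == 1))

# Equal opportunity: do the groups have similar true-positive rates?
eo_gap = abs(true_positive_rate(y_pred, y_true, group == 0)
             - true_positive_rate(y_pred, y_true, group == 1))

print(f"demographic parity gap: {dp_gap:.3f}")
print(f"equal opportunity gap:  {eo_gap:.3f}")
```

Loosely speaking, the Kleinberg, Mullainathan and Raghavan paper listed above (resource 8) shows that criteria of this general kind cannot all be satisfied at once except in degenerate cases, which is part of what makes the trade-offs discussed in the episode unavoidable.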
In this episode, we speak with Solon Barocas, an Assistant Professor in Information Science at Cornell University, currently on leave at Microsoft Research in New York before returning to Cornell to teach. We ask him about a class on ethics in data science that he has designed for aspiring computer scientists. Similar classes are springing up all around the country. How do you best infuse a culture of ethics into the field? What more do young professionals need to assess the social impact of their work? It's important to make sure that students know to reject certain applications of machine learning, Barocas says. But it's not enough to then expect them to go out and change the world, he warns.
This Week in Machine Learning & Artificial Intelligence (AI) Podcast
Today we’re joined by Solon Barocas, Assistant Professor of Information Science at Cornell University. Solon is also the co-founder of the Fairness, Accountability, and Transparency in Machine Learning workshop that is hosted annually at conferences like ICML. Solon and I caught up to discuss his work on model interpretability and the legal and policy implications of the use of machine learning models. In our conversation, we discuss the gap between law, policy, and ML, and how to build the bridge between them, including formalizing ethical frameworks for machine learning. We also look at his paper "The Intuitive Appeal of Explainable Machines," which proposes that explainability is really two problems, inscrutability and non-intuitiveness, and that disentangling the two allows us to better reason about the kind of explainability that's really needed in any given situation. The complete show notes for this episode can be found at https://twimlai.com/talk/219. And be sure to sign up for our weekly newsletter at https://twimlai.com/newsletter!
In this episode, Andrew Selbst, a Postdoctoral Scholar at the Data & Society Research Institute and Visiting Fellow at the Yale Information Society Project, discusses his article "The Intuitive Appeal of Explainable Machines" (co-authored with Solon Barocas, Assistant Professor in the Department of Information Science at Cornell University), which will appear in the Fordham Law Review. Selbst begins by framing the promise and peril of algorithmic decision-making. Among other things, he explains how algorithmic decision-making works and describes the current debate over how to regulate it. In particular, he notes that many regulatory proposals focus on requiring explanations of how an algorithm works. But he and Barocas argue that regulators should also require justifications for the construction of those algorithms, and they propose some ways in which those justifications could be provided.

Keywords: algorithmic accountability, explanations, law and technology, machine learning, big data, privacy, discrimination

See acast.com/privacy for privacy and opt-out information.
In this episode, you can hear Matt’s recent NYC panel discussion moderated by Solon Barocas, Ph.D. (Cornell University Assistant Professor & Co-Founder of Fairness, Accountability + Transparency in Machine Learning) and featuring panelists Adam Klein (Deputy Managing Partner – Outten & Golden); Eric Dunleavy, Ph.D. (Director of Personnel Selection & Litigation Support Services – DCI Consulting); and Kelly Trindel, Ph.D. (Head of Science + Diversity Analytics). http://www.akerman.com/podcasts/disclaimer/workedup.html
This week, the results are in. Tens of thousands of people joined the Privacy Paradox challenge. And it changed you. Before the project, we asked if you knew how to get more privacy into your life—43 percent said you did. After the project, that number went up to 80 percent. Almost 90 percent of you also said this project showed you privacy invasions you didn’t know existed. When we asked you what this project made you want to do, only 7 percent of you said “give up.” Sorry guys! Don’t. Fully 70 percent of you said you want to push for protection of our digital rights. We have ideas for that in our tip sheet. A third of you said you’ll delete a social media profile. Another third said this project made you want to meditate. And just one more stat. We tallied your answers to our privacy personality quiz and gave you a personality profile. One-fifth of us were true believers in privacy before the project. Now half of us are. Manoush says that includes her. In this episode, we talk through the results, and look to the future of privacy. With Michal Kosinski, creator of Apply Magic Sauce, and Solon Barocas, who studies the ethics of machine learning at Microsoft Research. Plus, reports from our listeners on the good, the bad and the ugly of their digital data. Support Note to Self by becoming a member today at NotetoSelfRadio.org/donate.
It's cold. Bed is so tempting. As is your sofa. But the siren song of your phone is calling you. According to Instagram and Facebook, every single person you know is looking gorgeous at the world's best party, eating photogenic snacks. Fear Of Missing Out. It's so real. And social media amplifies it 1000x. But maybe there's another path. Another acronym to embrace. The Joy Of Missing Out. JOMO. Caterina Fake popularized the term FOMO with a blog post waaaay back in 2011. And her friend Anil Dash coined the term JOMO (after missing a Prince concert to attend his child’s birth). On this week's (repeat) episode of Note to Self, the two talk about the role of acronyms, the importance of thoughtful software design, and the recent history of the Internet as we know it. And if you want even more Anil Dash, he'll be talking to Manoush on January 31st at the Greene Space in New York City. We're teaming up with our friends at ProPublica for an event called Breaking the Black Box: How Algorithms Make Decisions About You. Anil, plus ProPublica’s Julia Angwin, and Microsoft Research's Solon Barocas. Come! For more Note to Self, subscribe on iTunes, Stitcher, Google Play, TuneIn, I Heart Radio, Overcast, Pocket Casts, or anywhere else using our RSS feed. Support Note to Self by becoming a member today at NotetoSelfRadio.org/donate.
Happy new year! I'm pleased to post the first show of the winter quarter, Show #227, January 14, 2015, my interview with Solon Barocas, Postdoctoral Research Associate at Princeton's Center for Information Technology Policy and co-author of the article Big Data's Disparate Impact (with Andrew D. Selbst). Algorithmic computing and decision-making have entered our world much faster than our understanding of them. In his article, Solon takes a close look at the massively under-explored impact of algorithms on traditional forms of employment discrimination under Title VII of the Civil Rights Act (think discrimination on the basis of race or gender). Identifying both the technical and legal issues involved is a challenge, but the article does a wonderful job exposing the risks of algorithms in this space, which often (although not exclusively) include embedding human prejudices in the code itself. We examined these and other ramifications of algorithmic computing and civil rights discrimination in our discussion. I greatly enjoyed it (recorded at Princeton!) and hope that you find it illuminating. {Hearsay Culture is a talk show on KZSU-FM, Stanford, 90.1 FM, hosted by Center for Internet & Society Resident Fellow David S. Levine. The show includes guests and focuses on the intersection of technology and society. How is our world impacted by the great technological changes taking place? Each week, a different sphere is explored. For more information, please go to http://hearsayculture.com.}
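To make the Title VII discussion concrete, here is a minimal, hypothetical Python sketch of the "four-fifths rule," a conventional screening heuristic for disparate impact. The function and the hiring numbers are illustrative assumptions, not anything taken from the episode or the article.

```python
# Illustrative only: the "four-fifths rule" heuristic for disparate impact.

def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of the lower group's selection rate to the higher group's."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher

# Hypothetical numbers: 50 of 200 applicants selected in group A,
# 30 of 200 in group B.
ratio = adverse_impact_ratio(50, 200, 30, 200)
print(f"adverse impact ratio: {ratio:.2f}")  # 0.60

# A ratio below 0.8 is conventionally treated as evidence of
# potential disparate impact.
```

Part of the article's point, as we discuss, is that rate-based screens like this become much harder to reason about when the selection criterion is itself learned from historical data.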