Podcasts about Human Rights Data Analysis Group

  • 12 podcasts
  • 14 episodes
  • 38m average duration
  • Infrequent episodes
  • Latest episode: May 12, 2025
Human Rights Data Analysis Group

POPULARITY (chart): 2017–2024



Latest podcast episodes about Human Rights Data Analysis Group

Keen On Democracy
Episode 2531: Emily Bender and Alex Hanna on the AI Con

May 12, 2025 · 43:12


Is AI a big scam? In their co-authored new book, The AI Con, Emily Bender and Alex Hanna take aim at what they call big tech “hype”. They argue that large language models from OpenAI or Anthropic are merely what Bender dubs "stochastic parrots": systems that produce text without the human understanding or the revolutionary technology that these companies claim. Bender, a professor of linguistics, and Hanna, a former AI researcher at Google, both challenge the notion that AI will replace human workers, suggesting instead that these algorithms produce "mid" or "janky" content lacking human insight. They accuse tech companies of hyping fear of missing out (FOMO) to drive adoption. Instead of centralized AI controlled by corporations, they advocate for community-controlled technology that empowers users rather than exploiting them.

Five Takeaways (with a little help from Claude)
* Large language models are "stochastic parrots" that produce text based on probability distributions from training data without actual understanding or communicative intent.
* The AI "revolution" is primarily driven by marketing and hype rather than groundbreaking technological innovations, creating fear of missing out (FOMO) to drive adoption.
* AI companies are positioning their products as "general purpose technologies" like electricity, but LLMs lack the reliability and functionality to justify this comparison.
* Corporate AI is designed to replace human labor and centralize power, which the authors see as an inherently political project with concerning implications.
* Bender and Hanna advocate for community-controlled technology development where people have agency over the tools they use, citing examples like Te Hiku Media's language technology for Māori communities.

Dr. Emily M. Bender is a Professor of Linguistics at the University of Washington, where she is also the Faculty Director of the Computational Linguistics Master of Science program and affiliate faculty in the School of Computer Science and Engineering and the Information School. In 2023, she was included in the inaugural Time 100 list of the most influential people in AI. She is frequently consulted by policymakers, from municipal officials to the federal government to the United Nations, for insight into how to understand so-called AI technologies.

Dr. Alex Hanna is Director of Research at the Distributed AI Research Institute (DAIR). A sociologist by training, her work centers on the data used in new computational technologies, and the ways in which these data exacerbate racial, gender, and class inequality. She also works in the area of social movements, focusing on the dynamics of anti-racist campus protest in the US and Canada. She holds a BS in Computer Science and Mathematics and a BA in Sociology from Purdue University, and an MS and a PhD in Sociology from the University of Wisconsin-Madison. Dr. Hanna is the co-author of The AI Con (Harper, 2025), a book about AI and the hype around it. With Emily M. Bender, she also runs the Mystery AI Hype Theater 3000 series, playfully and wickedly tearing apart AI hype for a live audience online on Twitch and as a podcast. She has published widely in top-tier venues across the social sciences, including the journals Mobilization, American Behavioral Scientist, and Big Data & Society, and top-tier computer science conferences such as CSCW, FAccT, and NeurIPS. Dr. Hanna serves as a Senior Fellow at the Center for Applied Transgender Studies and sits on the advisory board for the Human Rights Data Analysis Group.
She is also a recipient of the Wisconsin Alumni Association's Forward Award, has been included on Fast Company's Queer 50 list (2021, 2024) and Business Insider's AI Power List, and has been featured in the Cal Academy of Sciences New Science exhibit, which highlights queer and trans scientists of color.

Named as one of the "100 most connected men" by GQ magazine, Andrew Keen is amongst the world's best known broadcasters and commentators. In addition to presenting the daily KEEN ON show, he is the host of the long-running How To Fix Democracy interview series. He is also the author of four prescient books about digital technology: CULT OF THE AMATEUR, DIGITAL VERTIGO, THE INTERNET IS NOT THE ANSWER and HOW TO FIX THE FUTURE. Andrew lives in San Francisco, is married to Cassandra Knight, Google's VP of Litigation & Discovery, and has two grown children.

Keen On America is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit keenon.substack.com/subscribe

THE ONE'S CHANGING THE WORLD -PODCAST
UNVEILING INEQUALITIES ON DATA, TECH & SOCIAL JUSTICE -ALEX HANNA: DISTRIBUTED AI RESEARCH INSTITUTE

Oct 2, 2023 · 30:17


#artificialintelligence #decentralization #aiforgood Dr. Alex Hanna is Director of Research at the Distributed AI Research Institute (DAIR). A sociologist by training, her work centers on the data used in new computational technologies, and the ways in which these data exacerbate racial, gender, and class inequality. She also works in the area of social movements, focusing on the dynamics of anti-racist campus protest in the US and Canada. Dr. Hanna has published widely in top-tier venues across the social sciences, including the journals Mobilization, American Behavioral Scientist, and Big Data & Society, and top-tier computer science conferences such as CSCW, FAccT, and NeurIPS. Dr. Hanna serves as a co-chair of Sociologists for Trans Justice, as a Senior Fellow at the Center for Applied Transgender Studies, and sits on the advisory board for the Human Rights Data Analysis Group and the Scholars Council for the UCLA Center for Critical Internet Inquiry. FastCompany included Dr. Hanna as part of their 2021 Queer 50, and she has been featured in the Cal Academy of Sciences New Science exhibit, which highlights queer and trans scientists of color. She holds a BS in Computer Science and Mathematics and a BA in Sociology from Purdue University, and an MS and a PhD in Sociology from the University of Wisconsin-Madison. https://www.linkedin.com/in/alex-hanna-ph-d https://alex-hanna.com/ https://twitter.com/alexhanna https://www.dair-institute.org/

Stats + Stories
The Dark Statistical Story of the World Cup | Stats + Stories Episode 295

Sep 14, 2023 · 30:19


Women's World Cup action in Australia and New Zealand has wrapped up and Spain's been crowned the champion. After players and fans headed home, residents were left to clean up after them. Hosts of such tournaments are also left to tackle the human rights implications of hosting an event of that scale. The human rights impacts of something like the World Cup are incredibly hard to measure, and that is the focus of this episode of Stats+Stories with guest Dr. Megan Price. Dr. Megan Price is the Executive Director of the Human Rights Data Analysis Group, where she designs strategies and methods for statistical analysis of human rights data for projects in a variety of locations including Guatemala, Colombia, and Syria. Her work in Guatemala includes serving as the lead statistician on a project in which she analyzed documents from the National Police Archive; she has also contributed analyses submitted as evidence in two court cases in Guatemala. Her work in Syria includes serving as the lead statistician and author on three reports, commissioned by the Office of the United Nations High Commissioner for Human Rights (OHCHR), on documented deaths in that country.

The Rights Track
Eyewitness: using digital technology to prosecute human rights abusers

Jun 28, 2022 · 29:17


In episode 8 of Series 7 of The Rights Track, Todd is in conversation with Wendy Betts, Director of eyeWitness, an International Bar Association project launched in 2015 which collects verifiable video of human rights violations for use in investigations and trials. We're asking Wendy how the use of digital technology can help to hold accountable those who commit human rights crimes.   Transcript Todd Landman  0:01  Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In series seven, we're discussing human rights in a digital world. I'm Todd Landman, in this episode, I'm delighted to be joined by Wendy Betts. Wendy is director of eyeWitness, an International Bar Association project launched in 2015, which collects verifiable video of human rights violations for use in investigations and trials. So today we're asking Wendy, how does the use of digital technology help to hold accountable those who commit human rights crimes? So Wendy, it's absolutely brilliant to have you on this episode of The Rights Track. So welcome. Wendy Betts  0:38  Thanks, Todd. It's great to be here. Todd Landman  0:40  You and I met in Bergen in Norway, we were at the Rafto Foundation awards for the Human Rights Data Analysis Group and Human Rights Data Analysis Group have featured in previous episodes on The Rights Track. And I see there is a kind of correlation, if you will, between the work of the Human Rights Data Analysis Group and the work that you do at eyeWitness. It is just that the data you're collecting is really video files and video footage. So tell us a little bit about the work that you're doing with eyeWitness. Wendy Betts  1:08  Absolutely. So at eyeWitness, we are helping human rights defenders in conflict zones and other places that are experiencing large scale human rights violations, to collect photo and video information in a way that makes it easier to authenticate. So that footage can be used in investigations and trials. So we work with human rights defenders in three ways. First, we're providing a mobile camera app that we designed to help ensure that the footage can be easily authenticated. And then we are helping to securely store that footage and maintain the chain of custody so it can eventually be used in investigations and trials. And third, we work to then take a working copy of that footage that we catalogue and tag to make it easier for investigators to identify footage that's potentially of interest to their investigations and incorporate that into those processes. Todd Landman  2:01  Well, that's a great summary of the work that you do. I recall when I was a student at Georgetown University, I worked in the Lauinger Library. And my job was to produce photographs in the pre-digital age. So this was processing rolls of film in the old cans, you used to kind of shake them with the chemicals and then use an enlarger and make photographs. And that was fine for special collections and photographing books. But one day, a Jesuit priest came into the library and handed me a roll of film and said I need 10 copies of each of these pictures. And they were actually photographs from the crime scene where Jesuit priests had been murdered in El Salvador. And I'm curious that when we enlarged those pictures and submitted them back to the authorities that requested them, is that kind of evidence still considered verifiable evidence?
And what is it that the digital element of all of this adds to the veracity and the verifiability of evidence collected on human rights crimes? Wendy Betts  2:58  There's a long history of photo and video being used as evidence, that photo and video in its hard copy form would need to be verified to go to court. So generally speaking, the court would want to speak with the photographer, or in the absence of the photographer, somebody that could help explain that that footage is indeed an accurate portrayal of that location at that time. And what digital technology has done is expand the ability of who can be the photographer to collect that potential evidence. So with the two trends of smartphones in everyone's pocket, plus the rise of social media platforms where people can share this information, you're suddenly seeing this massive proliferation of the amount of available information that could be used as evidence. But indeed, this also will need to be verified in much the same way. But the challenges to doing that are slightly different. And then the technology that we can bring to bear to do that is slightly different. Todd Landman  3:52  Yes, I understand those differences. And so there's a lot of debate today, if we take the War in Ukraine as a good example, when it first started, there was a flurry of activity on Twitter that said, don't believe everything you see on Twitter. So there of course will be manipulated images, manipulated video, I see manipulated video every day, some of it you can tell straight away, it just looks awful. It looks like a video game. Somebody's saying, look, you know, Ukrainians are taking out Russian tanks. And actually you look at the tank tracks and you can see it just looks like a photoshopped superimposed image of a tank running over some really bad terrain, to the fully verifiable accounts that we are seeing coming out of that conflict. So how are things verified? How does one dismiss imagery in one instance and accept imagery in another? What's the expertise required to give that verifiable account? Wendy Betts  4:43  I think when you're looking at verification, what you really want to know is whether that footage was taken where and when it was claimed. And if that footage has been edited, or as you note in your examples has it been photoshopped to look like something else? And then is it possible that, even if it was authentic and accurate to begin with, it has been changed somewhere along the way? So has it been taken down off social media and changed and reposted? And there's been two trends that have developed to address how we can do this. So one is the plethora of open source investigation techniques that have developed in terms of how can you geolocate images using satellite footage and other types of technology? How can you chronolocate, so how can you figure out when and where that footage was taken? Can you do frame by frame analyses to determine if that footage has been edited in any way? So that was one approach. And that has become increasingly professionalised. And is really coming to the fore in Ukraine. And then the other approach is the one that eyeWitness has taken where we developed a tool that can be used to hardwire that information in at the point that footage was taken. So those are called controlled capture tools, because you're basically controlling the information and controlling that footage, keeping it in a controlled environment for its entire lifespan.
So you're collecting information about where and when that footage was taken, you're ensuring that footage can't be edited. And you are maintaining that footage in that secure state through its lifespan. Todd Landman  6:04  So the app itself has the technology built inside it, you've actually hardwired that programmable element to the app, and it can't be tampered with. So if I download this app as a user, and I'm travelling through the world, and I want to document something, it's easy to use on a mobile device, easy to proliferate and sort of disseminate if you will out to users. And it's easy to learn by those users. Because the technology itself has been created in a way that preserves the identity and the verifiability of the images that are captured. Wendy Betts  6:39  That's exactly it. The eyeWitness app is designed to be really easy to pick up and start using, and on the surface, for the user interface, it's much like a standard mobile camera, so you have to open the app instead of your camera. But you're recording footage in the same way, you can enter the secure gallery where the footage is stored to see what you've taken. And you upload it to eyeWitness, this is how we maintain the chain of custody and secure that footage until it can be used. And then you have the option to share it with your social media networks, you can attach it to a WhatsApp message, you can do a variety of things with it. All of the verification aspect is intended to happen behind the scenes kind of inside the technology. So the app is designed indeed to collect information about where and when that footage was taken from three different sources, none of which are the user themselves. It's also collecting information to ensure that that footage can't be edited. So we are calculating basically a digital fingerprint at the moment that information is captured, that stays with that footage. So if any changes were ever to be made to it, you'd be able to spot that by running the algorithm for the fingerprint again, and then that footage is stored encrypted on the device, and then it's transmitted encrypted to eyeWitness so it can't be intercepted or manipulated either at rest on the phone or in transit on its way to us. Todd Landman  8:00  So you have a secure server where all these raw files are held. Is that right? Wendy Betts  8:05  Indeed. So we've been fortunate to partner on a pro bono relationship with LexisNexis Legal & Professional and so they host our server in their secure hosting environment that they have for litigation services for a variety of confidential evidence that's used in cases around the world. So they host our server, which allows us to scale up quickly and scale up to meet the need. And Ukraine is a perfect example. We've received more footage from Ukraine since the invasion began than we have globally in the last two years. So that ability to scale up quickly is very important, and more importantly, it is stored securely. So they have their state of the art security around that in a way that we couldn't necessarily put around a server if we were hosting it ourselves. Todd Landman  8:51  That's amazing. Can you tell us a little bit about the complexity of a Ukraine investigation? Let's take the case of Bucha.
We know in Bucha, that there were atrocities committed of some kind, clearly there has to be an evidentiary threshold reached, there has to be a profile of perpetrators and victims, there has to be that whole disaggregation of very complex human rights events of the kinds that you and I discussed with the team from Human Rights Data Analysis Group, but what are the steps that eyeWitness takes? What's the role that you take in the preparation of, let's say, an investigation into something like the Bucha incidents that we saw? Wendy Betts  9:30  So I think if we back up to your comment earlier about just the sheer amount of footage that we've been seeing on social media, and including from places like Bucha, that I think there's a sense that there is plenty of evidence out there, and we've got everything we need. And I think what everyone needs to take a step back and realise is how complex, as you said, these cases are. So you need information about what actually happened on the ground, what happened to these victims, and that takes the form of witness statements, it can take the form of physical evidence, it can take the form of photo and video, but we also need to know the context in which it's happening. If you want to elevate something to be a war crime, instead of a murder, you need to understand the conflict dynamic and what's happening. And then if you want to hold people at higher levels of authority responsible, and not just the people on the ground who pulled the trigger, you need to make those linkages. And that, again, is documentary evidence, it's witness evidence. So all of these pieces of this massive evidentiary puzzle have to come together. At eyeWitness, we see ourselves as one of these pieces, we are a photo video piece of evidence that can tell part of the story but has to work together with these other aspects. So we don't do full investigations ourselves and put all these pieces together, what we do is equip either civil society investigators, ordinary citizens, journalists, or others on the ground who have access to these scenes and are collecting photo and video with a tool to do it in a way that they can feed that information into investigations because it can be so easily verified, so they can contribute to this puzzle, in order to help hold the perpetrators responsible. Todd Landman  11:03  I think this whole portrayal of the contribution that you're making is really important. In our interview with the director of the Human Rights Data Analysis Group, Patrick Ball, the sort of data guru as it were in these areas, he said, you know, statistics are not a silver bullet. So the work that they do, would provide the statistical analysis that showed that certain things were happening that could not be explained by chance alone. But it was only ever one part of a very complex story alongside documentary evidence, alongside testimonies, alongside forensic anthropology, alongside many other things. And then ultimately, a determination of, let's say, genocide was a legal judgement that was either supported or not supported by the type of statistical evidence that was provided alongside other pieces of evidence. Now you're making a very similar case that whatever body is going to be prosecuting crimes, whether it is in Bucha or the broader conflict in Ukraine, eyeWitness is only ever going to be one part of that much bigger story. Is that right? Wendy Betts  12:02  Exactly, exactly.
I think all of these different strands of investigation have to work together, people collecting witness statements, the people doing open source investigation of footage and other information that was posted early on, people who have access to official documents, all of these pieces have to fit together, because as you said, in addition to showing just the baseline conduct happening on the ground, you need to show these patterns and magnitudes. And you can only do that by amassing large amounts of information that can show some of those patterns and run those types of statistical analysis that Patrick was talking about. So it all does fit together and complements each other. Todd Landman  12:42  Yeah. And you know, the conflict in Ukraine is by no means over. And you know, I read a report, I think it was yesterday, that said there are up to 30,000 war crimes that need to be investigated. Now, each crime itself requires extensive documentation, and then you multiply that by the number of crimes. And of course, there may be future crimes committed that will need to be documented as well. So the scale of just this conflict in Ukraine, you said, you've received more images from Ukraine than you have in the last two years from other areas of the world, and we may get to talking about those other areas of the world. But to me, the scale of what's happening in Ukraine, and the time that's required to fully prosecute many of these crimes means that we're really going to be in this for the long haul. Wendy Betts  13:25  Justice, unfortunately, in these types of cases is definitely a long term process, and the arc of justice is quite long. And that's what we hope is part of the value added of eyeWitness and why we provide that secure storage aspect, because the photos and videos taken now may well not be involved in an investigation or a trial for years and years to come. And so we can safeguard that footage in a way that even at that time, we can hand it over and it could stand up to the scrutiny. But indeed, I think we're looking at a long term prospect for justice. Todd Landman  13:58  Yes. And outside the Ukraine context, what are some other examples of where eyeWitness has been collecting this video footage from other parts of the world? Wendy Betts  14:06  So eyeWitness launched publicly in 2015. And we really do work globally. And we respond to the inquiries and the needs of human rights defenders in various parts of the world. Now, some places we don't advertise, especially where the security situation is quite serious for some of the human rights defenders using the eyeWitness app. But in other places, we have been able to be a bit more public. So we have been working actually in Ukraine since 2017. And we put out a report about shelling of civilian dwellings to the United Nations Special Rapporteur on the right to adequate housing. So that's one area where we've been active even before the current events. We've also recently submitted a report to the UN Special Rapporteur on extrajudicial killings related to violence occurring in the middle belt area of Nigeria between farmers and herders. We've also been active in the Palestine context with partners there using the eyeWitness app. So we've been quite broadly represented around the globe. And we view accountability broadly as well. And so that's why I'm mentioning non-judicial approaches to accountability.
Any efforts that can get at this conduct and investigate it and help to hold the perpetrators responsible is what we're interested in empowering human rights defenders to do. Todd Landman  15:25  Okay. And do you provide training alongside, because it's one thing just to download an app and start using it, but you might make sort of fundamental errors in using the technology from the start? So do you provide a training manual or workshops or online training for users as they download the app and then say, well, actually, this is the best way to film things? Or do you just sort of allow the technology to run in the hands of the users? Wendy Betts  15:49  Our preferred approach is to work in long term partnerships with human rights defenders that want to use the app, we very much see the app as a tool and to be used effectively, you do need to put more skill building and strengthening around that tool. So we do work hand in hand with human rights defenders, who plan to use the app on not only how to use the app, but how to incorporate photo and video into demonstrating whatever types of violations they are looking into, we can provide training on how you actually film when you're at the crime scene. We work with a lot of human rights defenders whose primary efforts have been advocacy oriented, and those are very different photos than photos you want to take for evidence. And so we work to help them make that shift as well. And so then we give them ongoing feedback. Once their footage starts coming in, we can provide tech support, if they're out in the field, and we know they're going on a documentation mission, we can be ready to answer any questions if they have any. So we really want to work with them hand in hand to not just use eyeWitness but use it effectively. Todd Landman  16:54  I understand and does the technology work in the absence of a mobile signal, in the absence of a WiFi connection? Can you collect videos on a phone, outside of network, and then when it gets back into the network, you're able to upload the images and videos that have been taken to a secure server? Wendy Betts  17:11  Our goal in designing eyeWitness is to make sure that it can work in the types of environments where these human rights defenders are active. And especially when you look at conflict zones where electricity may be disrupted, internet may be disrupted, cell service may be disrupted. So the app is designed to be able to collect, not only take the photos and videos, but all of the metadata that's needed to help verify where and when it was taken while offline. So you don't need to have access to the internet. Nor do you need to have a cell subscription or any other kind of data service that will collect all of that. It's designed to store that information securely in a gallery separate from the gallery on your phone. So it's hidden in a secure gallery. The idea being that these human rights defenders may have to make their way back to their headquarters or make their way back to someplace with internet before they're able to upload it to us and then delete it off their phones. So we wanted it to remain hidden in transit during that timeframe. So it is definitely aimed at helping individuals in contexts where there's high security risks, infrastructure challenges to be able to use the app. Todd Landman  18:17  You've definitely given that a lot of thought, I guess another question that flows from that is what's the minimum viable technical requirement on a phone?
Obviously, it needs to be a smartphone with a camera and a video. But how far back in time can you go in terms of the age of a device because of the availability of resources, etc in some of these conflict zones? What sort of phone is the basic unit you require to use the app? Wendy Betts  18:39  That is a really good question, because it's such an important issue in terms of access and availability of these tools to the vulnerable segments of society that need them most. First thing I should say is it's designed for Android, and we don't currently have an iOS version. Part of that is because the demographics of the places where we're working are primarily Android users. So it's designed for that operating system. And we've designed it to go back to Android 6.0, which I think is roughly operating systems on phones back to 2015. So it does stretch back a fair way, we made a decision not to go back any further. And that's because Android changed how it handles security at the 6.0 version onward. And we could harden the security of the information both to protect the user and the integrity of the information from that version onward in a way that was more difficult in previous versions. So that's how far back it goes. Todd Landman  19:32  And are there any plans to make this available in iOS? Or are there sort of limitations in terms of partnering with Apple to make that happen? Wendy Betts  19:40  We regularly revisit the question and we're actually currently in the process of again looking at whether we could replicate all the functionality that we currently offer (the security, the anonymity, those types of questions) in an iOS version, and then looking at the cost compared to the potential user base are the calculations we make. So we're looking at that right now again actually. Todd Landman  20:00  But for the user, this is free. It's an app that you download for free and then use. Is that right? Wendy Betts  20:05  Exactly. It's free. It's freely available. As I said, we would like to work in partnerships. But that's not necessary. Any individual can go to the Google Play Store now and download it and start using it. We do have written instruction guides on our website, we have a short video on how to use it and some other resources that are available. Todd Landman  20:25  Great. And then I guess my final set of questions really is about how this evidence connects to, let's say, different photographic evidence. You made passing reference to the use of satellite imagery, which has been a very powerful tool, I think the company Planet takes a picture of the entire surface of the Earth every 24 hours with its sort of flocks of satellites, then they have the system, if one satellite goes down, they can easily replace another one within the flock. And they have a tremendous number of images that are very high resolution, and I should say an increasing resolution. But that's one version of what you can see from space, as it were. And what you're saying is in the hands of users and defenders, you have almost a citizen science ground truthing that can take place as well. Are there any efforts to coordinate between your organisation and some of these providers of satellite imagery, if asked to do so? You mentioned the forced deportations or the destruction of houses. The Special Rapporteur on adequate housing, for example. So you could see satellites, just, you know, images before and after a village is destroyed.
But equally, you could triangulate that with your users on the ground, saying, Here's a house being destroyed, I'm hiding in a bush filming this right now. Is that sort of partnership and, you know, sort of holistic approach being developed in your organisation? Wendy Betts  21:38  So we have certainly used satellite footage in some of our analyses; the Ukraine one about shelling is a key example. In that case we didn't establish a partnership, we used what was publicly available that we could access to help go back and look at the dates and locations of the photos we have and then go back and look at satellite footage. And we use that primarily to determine when the attack actually took place. So we have photos dated as to when they were taken. But that doesn't necessarily give you the date of when the attack was. So we use satellite footage a lot to help determine, okay, well, this building looked intact on this date, and certainly looks more like the photo that we have on this date. And then that way, we were able to determine at what point the attack probably took place. We've also worked with another organisation that was doing an investigation of environmental damage in a different location. And in that case, they were able to get the latitude and longitude of the event that they were looking at using the app. And then they were able to get historic and current satellite footage for that location to be able to trace the trends of what was happening there. So they were looking at some environmental damage. So you can help see the change in the environment based on what you're seeing in the satellite photos. That being said, there's certainly the ability to work with satellite providers to help target. So I think if you're setting out to do an investigation, and you know you're going to be in certain places at a certain time and you need some of those satellites pointed at those locations, I think those types of partnerships are indeed possible. We haven't engaged with any of those at the moment because again, we tend to be led by our human rights defender users and what they want to investigate. But I think there are organisations that are engaging in those types of partnerships. Todd Landman  23:19  That's great. That's very clear in your explanation. And then I suppose a follow up question would be you've been in operation now since 2015, you've had seven years of footage coming in to the secure servers and you've supplied images to cases, can you tell us a story of success, you know, have there been successful prosecutions, in your mind, from a legal perspective, where you think that eyeWitness has made a definitive contribution to the outcome of those cases? Wendy Betts  23:44  Absolutely. So as I mentioned, we launched the app in 2015. And we're looking at atrocity crime. So going back to your earlier point about the long arc of justice in these crimes, we have to kind of bear that in mind at what point they might actually go to trial. So we've provided a significant amount of information to investigations at different stages of the process. And so not all of those have gone to trial yet. But indeed, we did collaborate on a case that has gone to trial and resulted in a conviction. And this was a project that we did in partnership with WITNESS, which is a group based in Brooklyn, and with TRIAL International, which is an organisation based in Switzerland, who does strategic litigation.
And we then all three of us partnered with a human rights defender group on the ground in the Democratic Republic of Congo. And in that case, they were investigating a massacre that took place in 2012 in two different villages in eastern DRC. And local human rights defenders were able to use the app based on training that they received from WITNESS on filming a crime scene, and to help put it into a case that TRIAL International was helping to build and they were able to use the app to go back and collect photos that helped to actually authenticate footage that was taken contemporaneously with the massacre. But that hadn't been stored in a way to protect the chain of custody. So they were able to go back and take footage of some of the same scarring injuries on the victims to demonstrate that the ones taken at the time, were accurate, and able to take photos of the mass grave, which could be used to help determine the number of bodies based on its dimensions and how that matched up with the reports of the number of people who had been killed, and with the photos that have been taken at the time of the burials. So all of this footage was entered into evidence by the prosecutor in the case, and was accepted by the court and was noted in the judgement about the power of the footage. And indeed, the two militia leaders were convicted of crimes against humanity. Todd Landman  25:51  Right. So that's a real success story, I had the pleasure of visiting WITNESS in Brooklyn back in 2011. And I recall that it's funny when you enter their offices, they sort of have a timeline of tech sitting in their front room, you know, cameras from ages ago, up to the latest stuff, and they're very, very good at training people how to represent human rights in a slightly different way that you do it. But working together, obviously has produced a great benefit. Now, it's that timeline of tech that interests me. And my final question is that, you know, technology continues to advance at an exponential rate. And what do you see for the future in this space? What would you like to do that you can't do yet, but you think will be possible in a few years time with respect to the technology that you've been working with and developing? Wendy Betts  26:32  That's a great question. There's so many exciting things that can happen with technology. I mean, there's already it's not even in the future, it's looking at virtual reality and using that for juries to kind of put them in the place of the crime scene. And that's all based on taking a number of photos and videos that can then be put through the algorithm to be transformed into virtual reality. There's the idea of being able to take 3D photo and video that you might be able to broadcast into the courtroom. I think the interesting component of that, though, is can the courts keep up? I think the courts now are trying to determine how to best handle digital evidence that's coming out of this flood of footage over the last 10 years. So I'm not sure we're ready to start talking about how we handle 3D images that are captured on a mobile phone. Todd Landman  27:20  Yeah, absolutely. 
And, you know, like DNA suddenly emerged as a new thing that, you know, transformed the legal profession in terms of solid evidence about whether somebody was actually present at a crime scene, and you could relitigate cases from many years ago, you've put your finger on that challenge between the advance of technology and the ability for legal entities to keep up and courts to keep up; there have to be determinations around what is an acceptable piece of evidence. And that's a very interesting challenge for the future. But you've given us so much to think about here. And I think there is this fear of technology, a fear of manipulation of images. There's also the fear of cracking an encrypted storage of these images. But you have given us assurances and confidence in the technology that you developed, the way that you've partnered with organisations to help you store this information. And then, of course, this chain of custody, the chain of evidence which is unbroken, and the ways in which these images really do contribute to, as you call it, the long arc of justice. So it's a very interesting conclusion to reach, at least at this stage, in listening to you and talking about how this form of technology which is in the palm of our hands, gives us the power in the palm of our hands to defend human rights in such interesting ways. And in my view, shows us that the digital transformation and technological advance we're seeing in the world can make a positive contribution to positive social change. So Wendy Betts, it just leaves me to thank you very, very much for sharing your thoughts with us today on The Rights Track. Wendy Betts  28:45  Thanks so much for having me. It was a great conversation. Christine Garrington  28:49  Thanks for listening to this episode of The Rights Track podcast, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts with funding from 3di. You can find a full transcript of this episode on the website at www.RightsTrack.org together with useful links to content mentioned in the discussion. Don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.   Additional Links: eyeWitness app
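The "digital fingerprint" Wendy describes in this episode corresponds to a standard cryptographic hash check: compute a digest over the footage and its capture metadata at the moment of recording, store the digest, and recompute it later to detect any change. The sketch below is a minimal illustration of that general idea only; it is not eyeWitness's actual implementation, and the file name, metadata fields, and function names are hypothetical.

```python
import hashlib
import json
from pathlib import Path

def fingerprint(video_path: str, metadata: dict) -> str:
    """Hash the raw video bytes together with their capture metadata.

    Any later change to either the footage or the metadata record
    produces a different digest, which is how tampering is detected.
    """
    h = hashlib.sha256()
    h.update(Path(video_path).read_bytes())
    # Canonical JSON so the same metadata always hashes identically.
    h.update(json.dumps(metadata, sort_keys=True).encode("utf-8"))
    return h.hexdigest()

def verify(video_path: str, metadata: dict, recorded_digest: str) -> bool:
    """Re-run the hash and compare it against the digest stored at capture time."""
    return fingerprint(video_path, metadata) == recorded_digest

if __name__ == "__main__":
    # Hypothetical capture-time record; in a controlled-capture tool the
    # location and timestamp would come from device sensors, not the user.
    meta = {"captured_at": "2022-03-04T10:15:00Z", "lat": 50.45, "lon": 30.52}
    digest = fingerprint("clip.mp4", meta)  # stored alongside the footage
    print("verified:", verify("clip.mp4", meta, digest))
```

In a real chain-of-custody workflow the digest would be created on the device and transmitted with the encrypted footage, so that the copy held on the secure server can be checked against it years later.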

Stats + Stories
The State of Human Rights in the Pandemic | Stats + Stories Episode 151

Aug 13, 2020 · 27:51


Almost every day we seem to get new data about the COVID crisis. Whether it's infection rates, death rates, testing rates, or false-negative rates, there's a lot of information to cull through. Making sense of COVID data is the focus of this episode of Stats and Stories with Megan Price and Maria Gargiulo. Megan Price is the Executive Director of the Human Rights Data Analysis Group, where she designs strategies and methods for statistical analysis of human rights data for projects in a variety of locations including Guatemala, Colombia, and Syria. Her work in Guatemala includes serving as the lead statistician on a project in which she analyzed documents from the National Police Archive; she has also contributed analyses submitted as evidence in two court cases in Guatemala. Her work in Syria includes serving as the lead statistician and author on three reports, commissioned by the Office of the United Nations High Commissioner for Human Rights (OHCHR), on documented deaths in that country. @StatMegan

Maria Gargiulo is a statistician at the Human Rights Data Analysis Group. She has conducted field research on intimate partner violence in Nicaragua and was a Civic Digital Fellow at the United States Census Bureau. She holds a B.S. in statistics and data science and Spanish literature from Yale University. She is also an avid tea drinker. You can find her on Twitter @thegargiulian.

Timestamps
2:55 What's the reaction been?
11:10 How important is the information in supporting these decisions?
14:30 What stories are we missing?
18:14 Schools and COVID
23:30 How to make sense of all of the COVID data

WashingTECH Tech Policy Podcast with Joe Miller
‘How Data Mapping Can Save Moms’ Lives’ with Licy Do Canto (Ep. 218)

Feb 3, 2020 · 17:58


‘How Data Mapping Can Save Moms’ Lives’ with Licy Do Canto (Ep. 218)

Bio

As Managing Director of BCW Healthcare in the firm’s Public Affairs and Crisis practice, Licy Do Canto (@LicyMD) leads policy and public affairs strategy for the firm’s healthcare clients in North America across public and corporate affairs, government relations, communications and reputation management on a diverse and broad range of healthcare issues. He also oversees the BCW Healthcare Team in Washington, D.C. An expert in health and healthcare policy, with twenty-five years of experience at the national, state and local levels across the nonprofit, philanthropic, corporate and government sectors, Licy is an accomplished, values-driven leader with unparalleled experience in developing and leading integrated public affairs campaigns combining strategic communications, public relations, political and legislative initiatives, policy, coalition building, grassroots/grasstops efforts and direct advocacy. Before joining BCW, Licy built and led a nationally recognized minority-owned strategic public affairs and communications firm, served as Health Practice Chair and Principal at The Raben Group, was the Chief Executive Officer of The AIDS Alliance for Children, Youth and Families, and managed and helped set the leadership direction for strategic policy, communications and advocacy investments in executive and senior government affairs roles for the American Cancer Society and the nation’s Community Health Centers. Before joining the private sector, Licy served as health policy advisor to U.S. Rep. Barney Frank and served in several stints in the Office of the late Sen. Edward M. Kennedy. During his extensive tenure in Washington, D.C., Licy has played a leading role in efforts to draft, shape and enact many pieces of legislation and policy affecting public health, the health care safety net and the U.S. health care system. Licy is a graduate of Duke University and holds a certificate in public health leadership from the University of North Carolina at Chapel Hill’s School of Public Health and Kenan-Flagler Business School, and is the recipient of multiple industry awards and citations for his leadership, policy and public affairs acumen, including being named to The Hill newspaper’s list of most influential leaders in Washington, D.C. consecutively over the last ten years.

Resources
BCW Global
S.3152 – Data Mapping to Save Moms’ Lives Act

News Roundup

Zuckerberg says new content policies will ‘piss off a lot of people’
Facebook CEO Mark Zuckerberg continues his crusade to be the standard-bearer of free speech even if his company’s policies “piss off a lot of people”. Zuckerberg told CNN that he plans to draw a line in the sand when it comes to censorship as he thinks Facebook is taking on too much of that responsibility. He says that the company will continue to remove the most harmful content and also discussed plans to ramp up encryption on Facebook’s messaging service. Zuckerberg has remained steadfast in maintaining Facebook’s policy of leaving up false statements by politicians in their ads.

Study: Using pre-trial risk assessment tools to book criminal defendants increases the likelihood they’ll be considered a flight risk
Courts routinely use pre-trial assessment tools to determine the likelihood that a defendant will flee if they’re released on bail. The higher the flight risk, the more pre-trial supervision the court will impose. The data these tools rely on includes data on past arrests. But a new study from the Human Rights Data Analysis Group and the San Francisco Public Defender’s Office notes that many of those arrests lead to acquittals. Despite the acquittals, the study found, courts recommended a higher level of pre-trial supervision in 27% of cases that included prior arrest data in their pre-trial assessment tools.

ACLU: Puerto Rico’s online voting plan is too risky
The American Civil Liberties Union is pushing back against legislation in Puerto Rico that aims to bring voting fully online by 2028. The ACLU is asking Puerto Rico’s Governor Wanda Vázquez to veto the bill after it passes the Legislative Assembly of Puerto Rico, which is expected to happen this week. Lawyers for the century-old advocacy organization argue that the plan is extremely susceptible to hacks and poses significant cybersecurity risks that threaten to undermine Puerto Ricans’ trust in the government.

House Oversight Committee seeks answers from dating apps on kids’ privacy
The House Oversight Committee’s Subcommittee on Economic and Consumer Policy launched an investigation into dating platforms’ failure to prevent underage users from signing up. Members of the subcommittee wrote Bumble, Grindr, The Meet Group, the Match Group, Tinder, and OkCupid seeking documents pertaining to any policies they have in place to keep underage users from pretending that they’re over 18 and to keep sex offenders from lurking on the platforms. The documents are due to the subcommittee on February 13.

Health records app pushed opioids
Bloomberg reports that in the midst of the opioid crisis, between 2016 and 2019, electronic health records company Practice Fusion pushed alerts encouraging opioid treatment on 230 million separate occasions. A Vermont federal court says the company has agreed to pay $145 million in civil and criminal damages.

Heinz Radio
Applying Data Analysis to Human Rights Investigations with Patrick Ball

Nov 4, 2019 · 24:58


Thorough data analysis is crucial to human rights investigations – but collecting and understanding these data can pose an incredible challenge. In this week’s episode, Patrick Ball, Director of Research at the Human Rights Data Analysis Group, joins us to discuss his experience with these challenges and the importance of applying rigorous science to human rights investigations. Patrick Ball, PhD, has spent the last 25 years conducting quantitative analysis for truth commissions, non-governmental organizations, international criminal tribunals, and United Nations missions in El Salvador, Ethiopia, Guatemala, Haiti, South Africa, Chad, Sri Lanka, East Timor, Sierra Leone, Kosovo, Liberia, Perú, Colombia, the Democratic Republic of Congo, and Syria. Patrick has provided expert testimony in several trials, including those of Slobodan Milošević, the former President of Serbia; José Efraín Ríos Montt, former de facto president of Guatemala; and Hissène Habré, the former President of Chad.

Off The Grid
Off The Grid - Syria’s Slaughterhouses

May 23, 2019 · 24:20


The AIBs (Association for International Broadcasting) highly commended the Syria’s Slaughterhouses documentary for its rare access to men and women who survived years of torture inside Syria’s prisons. The recognition was given in London, UK in November 2018. “Syria’s Slaughterhouses” provides a rare insight into Bashar Al-Assad’s detention system, including the account of a former guard at Saydnaya, the most infamous prison in Syria. Prisoners are sent there to die and have to endure constant torture and inhuman treatment. This network of prisons has been running for decades and is often described as a state killing machine. According to the Human Rights Data Analysis Group, almost 18,000 people were killed in government custody between March 2011 and December 2015, an average of 300 deaths each month. Detainees not only endure torture, but are also forced to commit crimes. Women are not spared and many had to undergo surgery after constant beating and repeated rape. Off the Grid is an award-winning, character-driven documentary series which tells compelling and in-depth stories from around the world.
Director and producer: Mouhssine Ennaimi
Executive Producer: Alexandra Pauliat
Pictures: Ensar Arvas
Hatay Producer: Sena Baran
Editor & colourist: Oguz Atabas, Deniz Salmanli
Motion Graphics: Zlatan Nezirovic, Selim Durak, Selim Buyukguner
Narrator: Adnan Nawaz
#OffTheGrid #Syria #slaughterhouses

Women in Data Science
Megan Price | Data Science and the Fight for Human Rights

Nov 20, 2018 · 46:23


Data scientists are involved in a wide array of domains, everything from healthcare to cybersecurity to cosmology. Megan Price and her colleagues at the Human Rights Data Analysis Group (HRDAG), however, are using data science to help bring human rights abusers to justice. The nonpartisan group played a key role in the case of Edgar Fernando García, a 26-year-old engineering student and labor activist who disappeared during Guatemala’s brutal civil war. Price, the executive director of HRDAG, says the investigation took years, but their work led to the conviction of two officers who kidnapped García and the former police chief who bore command responsibility for the crime. “It was one of the most satisfying projects that I’ve worked on,” she says. Price discussed the case in more detail, as well as other cases she’s worked on over the years and the role data science played, in an interview for the Women in Data Science podcast, recorded at Stanford University. For a recent project in Syria, Price’s group used statistical modeling and found information previously unobserved by local groups tracking the damage caused by the war. Similarly, in Mexico, she expects HRDAG to gain a better understanding of in-country violence by building a machine learning model to predict counties with a higher probability of undiscovered graves. Price hopes that in the future human rights and advocacy organizations will have their own in-house data scientists to further combat social injustices around the world, and she believes that data science will continue to play an important role in the field. She advises young people entering the field of data science and social change to learn a programming language, pick an editor, and find mentors and cheerleaders to help them along the way.

DataTalk
Using Statistics to Advance Social Justice & Human Rights w/ Dr. Megan Price

Aug 13, 2018 · 32:04


In this DataTalk, we chat with Dr. Megan Price, Executive Director of Human Rights Data Analysis Group, about ways to use data to promote social justice and human rights around the world. Megan earned her PhD in Biostatistics and a Certificate in Human Rights from the Rollins School of Public Health at Emory University. She also holds a master of science degree and bachelor of science degree in Statistics from Case Western Reserve University.

Stanford Social Innovation Review Podcast
Prediction vs. Bias in Data: A Debate

May 29, 2017 · 13:07


This panel from our Do Good Data | Data on Purpose conference features conference co-hosts Lucy Bernholz of Stanford PACS and Andrew Means of Uptake, along with Stanford education professor Candace Thille, and Kristian Lum, lead statistician at the Human Rights Data Analysis Group. The discussion focuses on the advantages and drawbacks of using data to analyze social trends in areas including higher education and criminal justice. View the slides from this presentation here. https://ssir.org/podcasts/entry/prediction_vs._bias_in_data_a_debate

The Rights Track
How can statistics advance human rights?

Jun 16, 2016 · 30:46


In Episode 7 of The Rights Track, Todd asks Patrick Ball, Director of Research for the Human Rights Data Analysis Group, how and when statistics can be used to advance and protect human rights. Here are some notes from the interview, including useful links and some additional resources from our partner, openGlobalRights.

0.00-8.55 mins
Todd outlines Patrick's work testifying against Slobodan Milosevic, on numerous truth commissions, and the evidence he provided at the trial of General Ríos Montt in Guatemala
How statistics can be used in general to advance human rights by showing patterns rather than specific individual cases
What Patrick means by ‘making the violation the unit of analysis', using an example from El Salvador, how this approach can help in efforts to hold individuals and countries to account, and how it prevents us from missing key information and complexities
Todd talks about parallels with Chile

8.55-17.20 mins
Patrick talks about where he gets his data and information from: voices/testimonies of survivors and victims of mass violence are collected and then coded for quantitative analysis
How this is defined by the governing legal structure, i.e. the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, or local and domestic law
Why the only truly reliable witness account is of killings/deaths and why this and violence are what Patrick and HRDAG focus on in their research
The key elements in a testimony that are required for a violation to be counted, and how to tackle the problem of people fabricating evidence

17.20-23.32 mins
How statistical modelling can be used to account for unreported deaths
Example of how, for the Peruvian Truth and Reconciliation Commission, Patrick and his team came up with a figure of 69,000 deaths from 17,000 statements on human rights abuses between 1980 and 2000
How concern about underreporting of deaths during Apartheid in South Africa prompted Patrick and his team to develop methods to account for missing accounts
Explanation of Multiple Systems Estimation and how Patrick employs it to estimate actual deaths and take account of the multiple views presented in data collection
How this method helps to close the gap between what we think we know and what is likely to be the truth of the matter, and how that can reveal the scale of the issue
Explanation of how people in Lima reacted negatively to the statistics because they didn't have a representative view of what had happened in war-torn rural regions
Anecdotal accounts versus statistical accounts: how statistical accounts can help us check our preconceptions

23.32-28.30 mins
How the law community criticises the use of statistics for failing to show ‘intentionality'
Patrick explains how statistics are just one piece of the evidence when a case is being built and made in a court of law, and the other information that is required to make that case
The difference between proof and evidence, and taking account of this in the process of showing human rights abuses and holding individuals, groups, governments and countries to account
How statistics are one part of the puzzle of who did what to whom, and why they are not a silver bullet

28.30-end
New work by HRDAG on killings in Syria: the challenges of collecting data, but how NGOs have been collecting what Patrick believes to be reliable accounts of deaths

Other useful links
How many Peruvians have died?

Related content on openGlobalRights
Violence data: what practitioners need to know
Quantitative data in human rights: what do the numbers really mean?
Finding balance: evaluating human rights work in complex environments
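The intuition behind the Multiple Systems Estimation mentioned in these show notes can be illustrated with the simplest two-list case, the Lincoln-Petersen estimator: if two independent documentation projects record overlapping sets of victims, the size of the overlap indicates how many victims neither list captured. This is only a toy sketch with made-up numbers; HRDAG's actual analyses use more than two lists and model dependence between them far more carefully.

```python
def lincoln_petersen(n_a: int, n_b: int, n_both: int) -> float:
    """Two-list capture-recapture estimate of total population size.

    n_a    -- victims documented by project A
    n_b    -- victims documented by project B
    n_both -- victims appearing on both lists (matched records)

    Assumes the lists are independent and every victim is equally likely
    to be recorded -- assumptions a real analysis must test and relax.
    """
    if n_both == 0:
        raise ValueError("No overlap between lists: estimator is undefined.")
    return n_a * n_b / n_both

# Toy example: 1,200 and 900 documented deaths, with 300 matched on both lists.
estimated_total = lincoln_petersen(1200, 900, 300)    # -> 3600.0
documented_unique = 1200 + 900 - 300                  # -> 1800 deaths on at least one list
undocumented = estimated_total - documented_unique    # -> 1800 deaths on neither list
print(estimated_total, undocumented)
```

The gap between the documented count and the estimated total is exactly the "gap between what we think we know and what is likely to be the truth of the matter" that the episode discusses.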

Open Society Foundations Podcast
Digital Echoes: Understanding Patterns of Mass Violence with Data and Statistics

Jun 23, 2015 · 106:17


Patrick Ball of the Human Rights Data Analysis Group presents real-world cases of how accurately understanding patterns of violence is increasingly important to advancing rights and justice. Speakers: Patrick Ball, Elizabeth Eagen. (Recorded: May 28, 2015)

You Are Not So Smart
022 - Survivorship Bias - Megan Price

Apr 24, 2014 · 76:55


The problem with sorting out failures and successes is that failures are often muted, destroyed, or somehow removed from view while successes are left behind, weighting your decisions and perceptions, tilting your view of the world. That means to be successful you must learn how to seek out what is missing. You must learn what not to do. Unfortunately, survivorship bias stands between you and the epiphanies you seek. To learn how to combat this pernicious bias, we explore the story of Abraham Wald and the Department of War Math founded during World War II, and then we interview Wald's modern-day counterpart, Megan Price, statistician and director of research at the Human Rights Data Analysis Group, who explains how she uses math and statistics to save lives and improve conditions in areas of the world suffering from the effects of war. See omnystudio.com/listener for privacy information.
