"What could the nonprofit sector do if every nonprofit had Google's engineering team?" (J. P.) In this thought-provoking episode, we sat down with Jake Porway, the visionary co-founder of DataKind, to explore the transformative potential of harnessing data science for the nonprofit sector. Jake takes us through the journey of the organization from its early days of hackathons to its evolution into a beacon for long-term, impact-oriented projects. Today, DataKind is made of many chapters worldwide, as data scientists and social workers meet to face challenges in their home countries. But we asked Jake about high-level issues as well — aspects he has put much thought into: Do nonprofits need their own data science teams? If not, what is the right model to leverage data science skills when the alternative is an incredibly high-paying job in the for-profit sector? Are organizations like DataKind suitable for generating products that scale or does their true value lie in creating a platform for much needed (and underfunded) Research and Development in the service of social impact? This episode is a must-listen for anyone who is working in tech, and is looking for a way to put their skills towards something more than ad campaign optimization. > Transcript on website RATE, WRITE, SUBSCRIBE Be sure to leave us a rating on Spotify or a review on Apple Podcasts! Wicked problems require more than one line of thought — was there anything you agreed or disagreed with? Anything you'd like us to explore further? Write us a note at podcast@techmatters.org and follow us on LinkedIn!
Dr. John Harnisher, who leads the education practice at DataKind, joins host Mike Palmer in a conversation about how data science can drive social impact in education. Having known each other for over 10 years, we have an engaging conversation reflecting on John's journey in the learning field. John gives us an inside look at DataKind's mission of bringing data science and AI capabilities to social impact organizations. He shares a case study from their work with John Jay College in New York, where they built machine learning models to predict which students were at risk of not graduating. By providing this data to academic advisors, John Jay was able to improve graduation rates by 30%. We discuss the importance of keeping humans involved when implementing AI, rather than just optimizing for the technology. We also cover topics including trust in data, the hype cycle around AI, and how to measure impact. John emphasizes that we should focus AI on core problems facing education organizations, not just chase the latest "new hotness." Don't miss this chance to learn more about how data science and AI can make a social impact in education and beyond. Subscribe to Trending in Education wherever you get your podcasts. Visit us at TrendinginEd.com for more sharp takes on what's emerging in the learning universe.
As we learned from Nithya Ramanathan in Season 1, data = money = power. To understand what this means for our future, and what we can actually do about it, we're coming back strong with a whole new lineup of interviews in Season 2. You'll hear from Jake Porway, co-founder of DataKind, Yvette Alberdingk Thijm of Witness, Owen Barder of Precision Development, and many more about the importance of human-centered design, field-based learning, and the needs, challenges, and power of data for good. Tune in to the biweekly Tech Matters podcast by following and subscribing on your favorite platform!
Afua Bruce is a leading public interest technologist whose work sits at the fascinating intersection of technology, policy, and society. Afua has an impressive history of serving her country and her community through various roles, demonstrating the powerful impact that technology can have when used for the public good. Her career started in the tech industry at IBM as a software engineer. She then took her talents to the FBI, using her engineering and management skills to serve her country. From there, she moved to the White House, where she directed federal interagency coordination through the National Science and Technology Council. Subsequently, Afua led the Public Interest Technology program at New America and served as the Chief Program Officer at DataKind. Currently, she is the Founder and Principal at ANB Advisory Group, an organization supporting responsible technology development. Afua co-authored "The Tech That Comes Next" alongside Amy Sample Ward. This book outlines how companies can deploy technology to improve people's lives and solve social problems.
Welcome to the Trust in Tech podcast, a project by the Integrity Institute — a community-driven think tank which advances the theory and practice of protecting the social internet, powered by our community of integrity professionals. In this episode, Institute member Talha Baig is in conversation with Tom Thorley of the Global Internet Forum to Counter Terrorism (GIFCT). The Forum was established to foster technical collaboration among member companies, advance relevant research, and share knowledge with smaller platforms. Tom Thorley is the Director of Technology at GIFCT and delivers cross-platform technical solutions for GIFCT members. He worked for over a decade at the British government's signals intelligence agency, GCHQ, where he specialized in issues at the nexus of technology and human behavior. As a passionate advocate for responsible technology, Tom is a member of the board of the SF Bay Area Internet Society Chapter, is a mentor with All Tech Is Human and Coding It Forward, and also volunteers with America On Tech and DataKind. Tom and Talha discuss the historical context behind the founding of GIFCT, the difficulties of cross-platform content moderation, and fighting terrorism over encrypted networks while maintaining human rights. As a reminder, the views stated in this episode are not affiliated with any organization and only represent the views of the individuals. We hope you enjoy the show. Credits: If you enjoyed today's conversation, please share this episode with your friends so we can continue making episodes like this. Today's episode was produced by Talha Baig. Music by Zhao Shen. Special thanks to Sahar, Cass, Rachel, and Sean for their continued support.
What does it mean to be a Daddy? How can someone embody or role-play their hot inner daddy energy? Does it always involve dominance? How does dirty talk come into the mix of daddy dynamics? What are some top tips for upping my dirty talk game? And what if I'm shy but want to be more sexually vocal and verbally express my sexual desires? "Daddy" Mischa shares all this and more. About our guest: Mischa Byruck (He/him) is a sexual integrity coach. He guides people to live their sex lives in line with their values. He is the founder of Evolve.Men, where he coaches, teaches, and writes on gender and sexuality. He is an ICF-certified coach, with training in abuser counseling, facilitation, antiracism, trauma, gender, consent, men's work, and sexuality. He teaches workshops on phone sex, dirty talk, and daddy kink, and is the consent education partner for Bonobo Network, the Bay Area's premier sex-positive community organization. He's also been an erotic model and appeared in amateur porn. Previously, he founded forLove, a virtual erotic events company, ran partnerships for DataKind and Code for America, worked in international development, and started a large-scale disaster relief organization. He holds a BA from Columbia and an MPA from NYU. Learn more about Mischa at Evolve.Men Check out his upcoming course here: BeyondConsent.love Find him on Instagram at @mByruck Come to Shameless Sex Podcast's live "G-spots, P-spots and Backdoor-lovin'" class in Salt Lake City on Saturday, Feb 25th! Purchase tickets at Blue Boutique's Sugarhouse location: blueboutique.com Or join us for the event online by registering here: https://fb.me/e/4aTb53igO Other links: Learn more about the American Sex podcast here: sunnymegatron.com Get 15% off your ultimate sexy holiday kit with code SHAMELESS at likeakitten.com/shameless Get turned on with 30 days free of super hot audio erotica at dipseastories.com/shameless Get 10% off + free shipping with code SHAMELESS on Uberlube AKA our favorite lubricant at uberlube.com Get 10% off while mastering the art of pleasure at OMGyes.com/shameless Get 15% off all of your sex toys with code SHAMELESSSEX at purepleasureshop.com
Amy and Afua will be kicking off the Nonprofit Social Media Summit with their keynote: The Tech That Comes Next! Sign up here! Changing the way we use, develop, and fund technology for social change is possible, and it starts with you. My guests in this special bonus podcast episode are Amy Sample Ward and Afua Bruce, authors of The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World. This important book outlines a vision of a more equitable and just world along with practical steps to creating it, appropriately leveraging technology along the way. AMY SAMPLE WARD (they/them) believes that technology should be accessible and accountable to everyone, especially communities historically and systemically excluded from the digital world. They are the CEO of NTEN, a nonprofit creating a world where missions and movements are more successful through the skillful and equitable use of technology. Amy's second book, Social Change Anytime Everywhere, was a Terry McAdam Book Award finalist. AFUA BRUCE (she/hers) is a leading public interest technologist who has spent her career working at the intersection of technology, policy, and society. Her career has spanned the government, non-profit, private, and academic sectors, as she has held senior science and technology positions at DataKind, the White House, the FBI, and IBM. Afua has a bachelor's degree in computer engineering, as well as an MBA. Connect with Amy and Afua: Twitter: @amyrsward @afua_bruce Website: https://thetechthatcomesnext.com/ About Julia Campbell, the host of the Nonprofit Nation podcast: Named as a top thought leader by Forbes and BizTech Magazine, Julia Campbell (she/hers) is an author, coach, and speaker on a mission to make the digital world a better place. She wrote her book, Storytelling in the Digital Age: A Guide for Nonprofits, as a roadmap for social change agents who want to build movements using engaging digital storytelling techniques. Her second book, How to Build and Mobilize a Social Media Community for Your Nonprofit, was published in 2020 as a call-to-arms for mission-driven organizations to use the power of social media to build movements. Julia's online courses, webinars, and keynote talks have helped hundreds of nonprofits make the shift to digital thinking and learn how to do effective marketing in the digital age. Take Julia's free nonprofit masterclass, 3 Must-Have Elements of Social Media That Converts
We talked about:
- Christine's background
- Private sector vs public sector
- Public policy
- The challenges of being a community organizer
- How public policy relates to political science
- Programs that teach data science for public policy
- Data science for public policy vs regular data science
- The importance of ethical data science in public policy
- How data science in social impact projects differs from other projects
- Other resources to learn about data science for public policy
- Challenges with getting data in data science for public policy
- The problems with accessing public datasets about recycling
- Christine's potential projects after her Master's degree
- Gender inequality in STEM fields
- Corporate responsibility and why organizations need social impact data scientists
- What you need to start making a social impact with data science
- 80,000 Hours
- Other use cases for public policy data science
- Coffee, Ethics & AI
- Finding Christine online
Links:
- Explore some Data Science for Social Good projects: http://www.dssgfellowship.org/projects/
- Bi-weekly Ethics in AI Coffee Chat: https://www.meetup.com/coffee-ethics-ai/
- Make a Social Impact with your Job: https://tinyurl.com/80khours
- Course in Data Ethics: https://ethics.fast.ai/
- Data Science for Social Good Berlin: https://dssg-berlin.org/
- CorrelAid: https://correlaid.org/
- DataKind: https://www.datakind.org/
- Christine's LinkedIn: https://www.linkedin.com/in/christinecepelak/
- Christine's Twitter: https://twitter.com/CLcep
- MLOps Zoomcamp: https://github.com/DataTalksClub/mlops-zoomcamp
- Join DataTalks.Club: https://datatalks.club/slack.html
- Our events: https://datatalks.club/events.html
This episode features Mitali Ayyangar, the portfolio manager for frontline health systems at DataKind. DataKind works with mission-driven organizations to unlock their data science potential ethically and responsibly. In this episode, Mitali talks about innovative data science tools that are increasing the confidence that governments and their implementing partners have in the quality of data that is collected at the frontlines of the healthcare system. You can learn more about DataKind by visiting: https://www.datakind.org/ If you have any thoughts on this episode, or recommendations of African health innovators that you'd like me to host on the show, please reach out to me directly on Twitter @DrSam_Oti, by email at sam.oti@alumni.harvard.edu, or via LinkedIn https://www.linkedin.com/in/samuel-oji-oti. Please note that The MedxTek Africa Podcast is distinct from Dr. Oti's role as a Senior Program Specialist at Canada's International Development Research Centre. The information provided in this podcast is not medical advice, nor should it be construed or applied as a replacement for medical advice. The MedxTek Africa Podcast, its production team, guests and partners assume no liability for the application of the podcast's content.
Hub & Spoken: Data | Analytics | Chief Data Officer | CDO | Strategy
In this podcast, Jason Foster talks to Giselle Cory, CEO of DataKind UK - a charity that supports social sector organisations to use data science. They talk about DataKind's mission, how data has provided valuable insights to different social organisations, how to build trust with data and the importance of volunteering data skills to help social change organisations.
Without a clearly defined methodology, complex projects with multiple technical and business stakeholders often fall apart. The risk is especially high when trying to scale data science work in an enterprise organization. That's why David Von Dollen, Head of AI at Volkswagen of America, integrated agile methodology with CRISP-DM to help his team navigate roadblocks and accelerate progress on the path to model deployment. He shares how this hybrid approach enables his team to be more strategic about project lifecycles, unlocking real business impact even faster. Plus, David provides advice for building relationships with key business stakeholders and shares his philosophy on using the art of data science to benefit humanity.
We discuss:
- Implementing hybrid CRISP-DM and agile methodologies
- Building relationships with stakeholders across the business
- Using data science to solve challenges outside of work
Mentioned during the show: DataKind
Tune in on Apple Podcasts, Spotify, our website, or wherever you listen to podcasts. Can't see the links above? Just visit domino.buzz/podcast for helpful links from each episode.
Lauren Woodman, the tech-for-good veteran who was recently announced as CEO of DataKind, joins me on Tech Talks Daily to discuss the future of the Data for Good movement. DataKind is a global non-profit organization that leverages data science in the service of humanity. They are unique for pushing the boundaries of what is possible when social and humanitarian causes leverage advanced technology to enact positive change in the world. Woodman has over 25 years of experience in the technology and non-profit sectors, including roles as congressional liaison at the UN and GM of Microsoft's global education programs. Most recently, she was the CEO at NetHope, where she awarded $190 million in technology capabilities and infrastructure to humanitarian organizations. In this episode, we discuss her experience as a pioneer in the tech policy and non-profit space. I also learn more about the current data projects DataKind is working on using AI solutions, and their social impact.
There's a lot of conversation happening about the ethical uses of data and statistics: how much weight should we put on numbers at all? How thoroughly should we investigate the methodologies used to create them? And who has access to the data? A special issue of CHANCE focuses on statistics and data science for good, and that is the topic of this episode of Stats and Stories with guests Caitlin Augustin and Matt Brems. Caitlin Augustin (@augustincaitlin) is a Senior Director at DataKind and is responsible for delivering DataKind's core offerings, ensuring that high-quality, impactful data science interventions are created in partnership with social sector leaders. Prior to DataKind, Caitlin worked as a research scientist at a digital education company and as an engineering professor at NYU. A lifelong volunteer, she's engaged with Central Florida's nonprofit community and is the organizer of the Orlando Lady Developers Meetup. Matt Brems is Managing Partner + Principal Data Scientist at BetaVector, a data science consultancy. His full-time professional data work spans computer vision, finance, education, consumer-packaged goods, and politics, and he earned General Assembly's 2019 "Distinguished Faculty Member of the Year" award. Matt is passionate about mentoring folks in data and tech careers, volunteers for Statistics Without Borders, and currently serves on its Executive Committee as the Marketing & Communications Director.
When Jen Stirrup speaks, she speaks softly. The meaning of her words, however, speaks loudly! Jen is CEO of Data Relish, a UK-based consultancy that delivers real business value through solving all manner of business challenges. You don't earn the nickname the Data Whisperer without knowing a great deal about Business Intelligence and AI. Jen certainly knows not only those topics, she knows SO much more!
References in this episode:
- DataKind
- The Art Of War
- Blade Runner Tears Scene
Episode Timeline:
4:30 - The human element of data, Bias in data, implications of Artificial Intelligence and Machine Learning, and COVID data
27:00 - The BI goal is Business Improvement, escalation and taking principled stands, Data-Driven vs Data Inspired
46:00 - Seeing the hidden costs of some business strategies, the value of even small successes, Diversity and Inclusion, and online bullying
1:29:30 - Jen's mugging story (!)
Episode Transcript:
Rob Collie (00:00:00): Hello friends. Today's guest is Jen Stirrup. Jen and I have had one of those long-running internet friendships that are so common these days, especially in the data world and in certain communities. But we've also had the opportunity to meet in person several times at those things that we used to do called "in-person physical conferences." She's an incredibly well-seasoned veteran of the data world, but if you're expecting us to be talking about things like star schema and DAX optimization, that's not really what we talked about. You know that our tagline here is "data with the human element," and we definitely leaned into that human element in today's show. Now, we do talk about some of the important human dynamics of data projects. For example, how the business intelligence industry kind of lost its way in the past and forgot that it's all about improvement, and how we're as an industry waking back up to that today. Rob Collie (00:00:54): We also talked about the value of having even one signature success in a large organization that other people can look at to become inspired. And she has some very interesting and well-founded semantic opinions about terms like "data-driven" and why maybe "data-inspired" is better. Similarly, she prefers the term "data fluent" to "data literate", and she explains why. But we also touched repeatedly on the themes of ethics and inclusivity in the world of data. Now, I have a personal idea that I haven't really shared on this show before that I call "radical moderation." It's the idea that you can be polite, you can be reasonable, while at the same time advocating for sharp change. Now, this is personally what I would like to see emerge in our political sphere, for instance, a form of polite radicalism. We need to change, but we need to be nice. Rob Collie (00:01:52): There aren't many readily available examples that I could point to if I wanted to show you "this is what radical moderation looks like." But now if someone asked me for that, I can point them to this conversation we have with Jen. She is soft-spoken, she is polite, she is open-minded, including the open-mindedness that she might not always be correct. And yet, underneath all of that, is a very firm conviction that we need to be better. And I think that's the best introduction I can give this because I don't want to spoil anything upfront. So, let's get into it. Announcer (00:02:28): Ladies and gentlemen, may I have your attention please?
Announcer (00:02:32): This is the Raw Data By P3 Adaptive Podcast, with your host, Rob Collie, and your co-host, Thomas Larock. Find out what the experts at P3 Adaptive can do for your business. Just go to p3adaptive.com. Raw Data By P3 Adaptive is data...with the human element. Rob Collie (00:02:56): Welcome to the show, Jen Stirrup. It is such a pleasure to see you again, virtually, and talk to you. I'm really glad we were able to do this. So thrilled to have you here. Jen Stirrup (00:03:06): Thank you so much for having me. I'm glad we made it work in the end. Diaries, schedules, everything else, but I'm really glad to be here and it's great to speak to you. Rob Collie (00:03:15): I know bits and pieces of the Jen Stirrup story and I know bits and pieces of what you're up to. How do you describe yourself on your LinkedIn profile? Jen Stirrup (00:03:23): So I would describe myself as really trying to help people make their data better. I've just finished a post-COVID data strategy for a healthcare organization in the US and in the UK. The reason I'm doing that is to try and have a big impact. I believe in that. I think COVID has brought around a real stress on a lot of technical architectures, and a lot of data architectures as well, and there are all sorts of pressures. So I've just finished that, which has been a nice piece of work. I've been working with a religious organization on their data as well. A lot of people are accessing their services as part of a recovery from COVID. I think it's been a very difficult, challenging time for a lot of people in terms of mental health, and I like to think that by solving these problems you're actually helping people, in a way connecting with people, some of whom you may never meet, but that's okay. That's really what I like to do, I think, it's a way of connecting, I think. Rob Collie (00:04:22): We subtitled the show 'Data With The Human Element.' You think of the data field as like this cold, analytical, sanitary thing, and it's not, right? If you're doing it right, you're having an impact in the human plane, and it's a leveraged impact because you can really sort of touch a lot of people's lives via the central hub that is data. And you've got to keep the human beings in mind, even to be successful at the quote-unquote "cold, calculating data stuff." If you don't keep the humans sort of first and foremost in your mind, you're not going to design, for example, a good data strategy, like what you just finished. Jen Stirrup (00:05:02): That's right. So I believe that the information ladder is quite important. So we start off with data, then we need to turn that into information, but then we need to turn it into knowledge and then wisdom. And I think COVID has taught us many things. I think it's maybe taught us a sense of purpose, it's something that can help drive all of us. Data can be part of that, and I think that data in some ways has been replacing some of the bigger-purpose questions that perhaps we should ask ourselves more often as human beings. With artificial intelligence, particularly, I'm finding that people are replacing data with, perhaps, information, knowledge, or wisdom and say "what does the data say?" and that's fine, but we have to have the context to the data as well. Jen Stirrup (00:05:47): I think in some ways with artificial intelligence, what people are trying to do is build a little box of data and it's becoming this oracle that people are going to touch and say: "So, what does the data say?"
It's like we are taking this box and we're trying to turn it into some sort of God that we can touch, and it's going to give us all the answers, but if we're going to do that, it has to be a God that we are comfortable to live with, and it's one that we can choose, and one that fits in with people's ethics and their sense of purpose. So, I see data as part of something that can make us all better in so many different ways, whether that is healing or bringing people together. Jen Stirrup (00:06:29): So I think if we could solve these problems where people are feeling that they are not interconnected, then we could start to try and look at that and perhaps think about making people feel whole and feel more together. Because I think what COVID has done is really helped us to focus a lot on data but perhaps not on how we could do things better. It seems that we have an opportunity to decide what goes back in to make the new normal or the next normal. And I'm worried, I suppose, that I don't see that happening as much as I would like. So yeah, data is important. Absolutely. We wouldn't be here without it, and the fact people are struggling with it does pay my mortgage. I still would like us to ask ourselves the bigger questions as well; that's something that's important to me. Rob Collie (00:07:14): Let me check here. Oh yeah, yup, it pays my mortgage as well. We're here for a reason, that's for sure. I loved you talking about the AI, this box, that we're going to sort of elevate to the status of a God, or that's how a lot of people are viewing it subconsciously. Of course, it's a box that we built. Jen Stirrup (00:07:33): Yeah. Rob Collie (00:07:33): We fed it with our context. It got fed with our assumptions and also our blind spots, and now if it makes decisions, that thing starts making judgments and decisions that impact people's lives. It's a tricky proposition, it's one that's best approached very carefully. Jen Stirrup (00:07:55): I agree, and I think that's why the bigger questions are important. So say, for example, you may have seen the Netflix information series. It was called 'The Social Hack' or something like that. I've forgotten the name, but it was talking about the role of bias in data. One of the researchers found that their facial recognition algorithm didn't recognize a face. And the reason for that was that she's black, and for me, I just thought, that's such a preventable issue, and how much time do you spend looking at preventable issues? And perhaps not very much. I still see the magpie problem a lot in technology. Companies are happier buying a new technology that they see that's going to solve all their problems, but actually it's not doing that. It's maybe replacing it with a bad answer to a different question. We can see that right now in artificial intelligence. Jen Stirrup (00:08:48): There's some research going on, which will decrease the size of data sets that AI needs in order to create its algorithms, and that sounds fine. It's a good piece of research, but what I'd like to see is more research on collating datasets which are less biased, so that we can think about focusing on trying to make the algorithms fairer rather than focusing on making them smaller. Jen Stirrup (00:09:13): I know a few years ago, you probably remember, everyone talked about big data. Big data was the thing, but we didn't ask ourselves if this was the right data. It might be big, but if it's missing out large sections of the population, then that's building in inequality before we get started.
I think, even if you don't have the answers, asking these questions is a good thing. I don't have all the answers. There's people working in this field much, much smarter than me and they all live and breathe this stuff, and I read the things that they're doing and talking about, and I think this is such an important part of what we do every day. I think it's really important. I don't know what you think, but there's so much going on in the world of data at the moment that it feels hard to keep up sometimes. Thomas Larock (00:09:58): So first I want you both to remember in case you've forgotten, but you can purchase the Azure Data Box, that does exist. Rob Collie (00:10:07): We will just call it God in a box. Thomas Larock (00:10:09): Azure Data Box, it's actually for shipping storage to an Azure data center, but that's what they chose to call it and I said: "You put your data in the box or it gets the hose again." Right? So- Rob Collie (00:10:20): No no, Tom, it's one: "Put your data in the box." Thomas Larock (00:10:26): So, I mean, that does exist. The first point I wanted to make that you danced around, like Rob you were talking about how we're building this thing and it comes with all of our failings. And I know Jen, she leads discussions on diversity, inclusion, and equality, and I try to emphasize why that's so much more important, especially seeing the rise, and I saw the Netflix special as well, and the Data Justice League. The idea is we need to have those programs in order to have better models. We have to be aware of the bias inherent in the stuff that has already been built. And I think there's a lot more awareness over the last 18 months regarding the products that are on the market that are already failing us because they were built with these biases. And that's a difficult thing to overcome now that you have police departments or governments deploying this technology, thinking, as Jen said, it's this God that is just going to give you all the answers. Thomas Larock (00:11:35): Jen, you also hinted at the thing about the question. So, you're replacing one problem with another, and that made me think of how vital it is that you understand the question you need answered, and a lot of times that gets kind of shifted, it's fluid almost. It's like: "Oh, well, we were doing this thing; we think this next thing we'll solve for it." But the next thing you're getting is actually answering a completely different question than what you thought you were doing, and it leads to a huge, huge disconnect. And I think the last thing I would say, Jen, I've seen that research about the data sets. I'm encouraged by the idea that we could get people to understand that it's not the volume of data that makes a better model. It's the data that was chosen to be collected and the manner in which it's collected. Thomas Larock (00:12:30): So I know the research on building these models and they're saying: "Yeah, you don't need a billion rows. The accuracy tails off at some point after, say, a million rows." At some point more data doesn't make this model any more accurate, but the inherent problem is how was it collected? What were the biases and how was it collected? What was missing? Was it missing at random? Was it missing not at random? The analysis necessary to conduct that research, I think, is where we are sorely lacking in business. I know it exists in academia, but those people, they don't scale.
There's only so many of those, and there's a lot more businesses trying to get the job done, so I think that's fairly important. Jen Stirrup (00:13:13): There is a huge gap between academia and business. I guess there always has been. I do speak to academic institutions from time to time and it's clear that they are doing so much work. They really are, but how that is getting out, I am not sure. Maybe that's why they asked me to come and talk to them so I can talk to other people about what they're doing, and I don't mind doing that. I think there needs to be more of that, because I think these scientists, these academics working in this, have to get access to each other as well, and the multidisciplinary aspect of it is really interesting. I did a Postgraduate in Cognitive Science about 20 years ago, and suddenly it's back round again, and it's about philosophy, linguistics, psychology, AI. And why did that go away? Jen Stirrup (00:14:03): It should never have really gone away. I think we got as an industry perhaps Goldstone and such technologies which these things were re-badged as, and we got derailed by the marketing efforts. But I think that there's real room for doing these things in a better way. I don't know if you see this, but I see, or maybe it's my age now, I've been around in the industry for a long time, but I see that people are doing and making mistakes that I first saw 20 years ago; data collection, which you already mentioned, Tom, that's been there for a long time and then it seemed to go away. Jen Stirrup (00:14:36): I think that's why academia does help, because it gives us maybe more of that consistent background than perhaps we get from marketing noise, which goes round in cycles and trends as people are under pressure to purchase these licenses or whatever it happens to be. I wish I had better answers for all of this. I think sometimes it's about just asking these questions, blogging, talking about them, putting them on social media so that when people are thinking, "what do I do about data strategy?", these things are part of this. I saw a study recently saying that companies are decreasingly likely to include ethics and these questions and bigger societal questions as part of their data strategies as you're trying to get the link. But it disheartens me because I thought I could see that the voices are getting squeezed out. Rob Collie (00:15:25): Decreasingly likely, like we're trending- Jen Stirrup (00:15:28): Trending down. Rob Collie (00:15:28): You know, it'd be one thing to be flat, right? I mean that would also be disheartening, but to be decreasing, decreasingly likely to be factoring in ethics into a data strategy. Now we've been talking a lot and I think it's a good thing to continue to talk about the implications of AI and machine learning in this space; the business intelligence industry isn't particularly fraught with this kind of problem, right. Transactions happened, or they didn't, you know, and it was a number, a six or a seven. I mean like, you can get it wrong, you can have bugs, right. But there isn't any, like, objective debate, there shouldn't be anyway, about what actually has happened. But the decider systems are a completely different game, like where should we route this patient? This is going to have a huge impact on their life.
Rob Collie (00:16:21): That's a very, very, very different game and we've been talking about sort of like, the completeness of the data that is used to train these systems, but I think it's really instructive just to stop for a moment and go, you know what, even if we were able to feed these systems a 100% comprehensive picture of today's world, we still have to accept the fact that we're telling it that today's world is what we want. Right. And maybe we don't, you know, and there's always a judgment in training these systems, we tell it what is a success and what isn't a success. Our unintentional biases can leak into this stuff in a million different places, even if you suddenly had God-like comprehensive powers to feed it, quote-unquote, all the data, right. It's still leaky. It's still fraught. Jen Stirrup (00:17:13): Yeah, and actually, I think it's an extension of the problem that we see just when we're building a data warehouse. Sometimes I'll go into a customer and they'll say, "you know, we want to see our data and see our latest vendor here," and then I'll say, "well, is it preserving the data or is it just, you know, been reamed out the other end, what you're doing with it? Where you're storing it?" And then the argument against the data warehouse as well. It's not going to capture everything in the possible universe of possibilities in my business, so I don't want to do it. And I find the argument goes something like, "there's an edge case that it won't cover." Or, "this edge case, it won't cover here." And then you have to say, "well, you know, okay. So it's not going to cover all the possible edge cases, but it will cover 80% of what you need, and the rest can go to shadow IT or shadow data systems or wherever they happen to be." Jen Stirrup (00:18:03): And I think we're still, at the bigger-picture level, perhaps trying to control everything that happens around our business, but we have to be flexible enough to cater for these scenarios we haven't seen before. I think that's what makes the AI so difficult actually, as we have more than one type of AI. We have a general artificial intelligence, which is more like Terminator, you know, these kinds of things. Rob Collie (00:18:29): Innocuous stuff like that. Thomas Larock (00:18:30): Harmless. What's the worst that could happen? Rob Collie (00:18:32): Yeah. I mean. Jen Stirrup (00:18:35): Well, I think as humans, we do enough damage to ourselves, most of the time we don't need a Skynet. Thomas Larock (00:18:38): That's true. I agree. That's often my reaction to, well you know, like self-driving cars, like what if it makes this mistake? Okay, yeah, but the human track record behind the wheel, we're not trying to be perfect, we're just trying to be better than people, which is a little bit more achievable perhaps.
And that's more general AI; that's more difficult to program. But if we're to think of AI being more successful for business automation and productivity, it's just trying to do something, one thing really, really well, something that will help a human to make better decisions faster. Jen Stirrup (00:19:51): Such as perhaps parceling out x-rays which don't show any presence of a tumor, as an example, but we then get the 10% of x-rays that may show something and pass those on to a human to look at. So there's plenty of room for defining what success looks like for us for artificial intelligence, I think. With business intelligence, you're right, we should have one version of the truth. People are still living so much in Excel and Google Sheets, little empires tucked away, sitting on their laptops. How do you move that to the cloud? So you move them perhaps to Office 365 or a Google Workspace, and then you're trying to encourage people to rethink the processes: hey, why do we save stuff in the cloud? Or why do we make our decision making more apparent? And it seems a bit difficult to ask AI to make its decision-making more apparent, when actually a lot of people spend time hiding or umpiring the knowledge anyway. Jen Stirrup (00:20:49): I don't know if you think this, but I often think business intelligence problems are change management problems in disguise. It just happens to be showing up in the data that there's a problem. Thomas Larock (00:20:59): Yeah. Rob Collie (00:20:59): Ultimately it's not about knowing, it's about improving. Knowing that there's a problem and even knowing what's causing it is really just the beginning. Very often it's like, okay, now what? This is going to be a really difficult problem to address operationally.
Jen Stirrup (00:22:51): Yeah. Sometimes I see really basic issues. One customer example: they were calculating the mean incorrectly for two years. And then two years before that, for another two years, they were calculating the median incorrectly in Excel. What they were doing was taking the middle value of a column. So of course, if you sorted the column next to it, the value changed. And they said that that was the median. And I said, "okay, so you've got a column of 20 items. Are you telling me that whatever's at number 10 is the median?" And they said, "well, yes, that's in column B." What happens if you change the order in column E from perhaps alphabetical order to reverse alphabetical order? The values can be changed, right? And they looked at me and I said, "why did you calculate it like that?" Jen Stirrup (00:23:41): And they said, "'cause we can calculate the mean using the Excel formula." So eventually I said, "why are you using the mean?", because it's quite sensitive to outliers; the median's better. And then they said, "well, we've tried that but we couldn't calculate the median either." I said, "so, okay, for four years you've been trying to calculate the mean and the median incorrectly in this one spreadsheet. Can you tell me about the rest of your spreadsheets? How often are you trying to use the median or the mean, all of it incorrectly?" And I think it's probably the only time in my 20-plus-year career I've seen a customer actually punch himself in the face, and it was just absolutely stunning. And he said, "I'll go and speak to the statisticians." And I thought, you've got statisticians working here. I'd love to meet them. Jen Stirrup (00:24:26): I wonder what they're telling you. And that was my second day on site; I was there on and off for six months. And that was just the first problem I found. So I know we talked about data literacy. I'm not a fan of that phrase. I prefer fluency or something along those lines. So I don't want to assume people are data illiterate. Because I don't think that they are. I think we're born with an innate sense of numbers, in a way; we can tell more from less, right? My dog can do it, right. So if I've got five treats in my hand, he knows I've got others. If I just give him one, he's not stupid, he has a sense of quantity. And I think it's about, we need to get better in industry, perhaps, at explaining results, findings, conclusions, and context to people instead of just throwing dashboards at people and expecting them to understand it. Jen Stirrup (00:25:16): Somebody recently sent me a scientific article which was all about COVID and some testing that they did in mice, and I could read it, but I couldn't understand it because I don't have a background in medicine. I read the abstract and I read the last paragraph and the first paragraph, but I didn't read the rest of it because I thought this is way beyond me. I don't understand what they're trying to say. But I think for me that highlighted a problem with data literacy: I could read it, I couldn't understand it, and I certainly couldn't act on it. And I don't want to give other people who are trying to consume business intelligence products in some way, whether they're dashboards or even dumps from Excel, the feeling that they just don't understand what they're getting.
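The spreadsheet mistake Jen describes is easy to reproduce. Here is a minimal sketch in Python, with made-up numbers rather than anything from the episode, of why the value sitting in the middle row of an unsorted column is not the median, while a properly computed median is stable no matter how the rows happen to be ordered:

```python
import statistics

# Hypothetical column of 20 values, in the arbitrary order the rows were entered.
column_b = [12, 95, 7, 40, 63, 3, 88, 21, 55, 30,
            71, 14, 9, 67, 26, 50, 83, 18, 44, 36]

# The customer's "median": whatever value happens to sit in the middle row (row 10 of 20).
middle_cell = column_b[len(column_b) // 2 - 1]

# A real median: sort the values, then average the two middle ones (even count).
true_median = statistics.median(column_b)

print(middle_cell)  # 30   -> depends entirely on the current row order
print(true_median)  # 38.0 -> identical for any row order

# Re-sorting the rows (say, by an adjacent column) moves a different value into row 10,
# which is exactly the behavior the customer observed.
reordered = sorted(column_b, reverse=True)
print(reordered[len(reordered) // 2 - 1])  # 40   -> the "median" just changed
print(statistics.median(reordered))        # 38.0 -> unchanged
```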
I had a woman who worked for me, she actually was a qualified librarian. So, her insights about information retrieval were very interesting. I learned a lot from her, because that was a little bit the data. And she would say things like, "Jennifer, Google is not the only search facility in the world. We can use so much more," because she's accessed all their library systems around the world. And there's so much information we don't access because we can't, usually. But the point being that what I learned from her was about translating things, where they were easier to understand for other people. And I think it's an incredibly valuable lesson, and the world needs more librarians. Rob Collie (00:26:43): There's a lot here, right? Business intelligence was always a means to an end, but because it was so difficult, it was just so incredibly difficult to even get a halfway-competent system instilled, built, configured. When something is that hard for that long, it becomes its own goal after a while. It's easy to habituate to the idea that this is the goal, intelligence is the goal, knowledge is the goal. No, no, no. Improvement was always the goal. What's really been fascinating for us is, when we see our clients, the people we work with, when we see them start to get the BI problem under control for the first time ever, their gaze immediately sort of zooms back and they start thinking completely unbidden by us. We don't have to seed this conversation. It just happens. They start looking at the bigger picture now and going, "Oh, okay. So, now this information needs to feed into better decision loops and optimization and things like that. And how do we facilitate that?" Rob Collie (00:27:53): And from the beginning, we try to counsel everything being built around that "taking action" thing. You can build an incredibly informative dashboard that is intelligent, it's a work of art in many ways, on many levels, and it can be useless. It can be factual, it can be impressive, and it can be useless because you can't use it to make any improved decisions. I've been guilty of this. I have built things like this, like, "Ta-da." And the client doesn't even have the language to push back. Jen Stirrup (00:28:30): It's something I've tried to keep in mind now is the utility of what I'm actually doing, because people just want data for the sake of data, and they get that. I think, sometimes, they don't know what to ask for, so they take something because it's better than nothing. And they'll say things like, "Right, I want the last five years of data and 191 columns, I want it all on the same page, and I want to be able to print it." And then you have to say, "Well, let's think about how feasible that is. You'll get five years of data, it's not going to fit in one page. 191 columns is going to be really small. So, let's have a..." People ask that because they don't know what they want. Jen Stirrup (00:29:06): About a dashboard recently, a health and safety dashboard, it was using power apps as well. So, the company, if they saw a health and safety priority issue, they could use the app, if they were health and safety professionals, and the app would record data, you could upload a photograph, and then that would go into a system which you could then see in Power BI. And the nice thing about that was you could see improvements over time because people could get their health and safety issues resolved more quickly, so things like boxes stacked against fire exits, slip and trip hazards. 
Jen Stirrup (00:29:43): Now, it may not seem very interesting, but actually, the reason that project had happened was because someone had been in a health and safety incident and it had not been tracked properly, and the idea being that they were trying to improve the process. But sometimes, I think data problems and data solutions happen because of two things. One is you need an executive sponsor, and the second thing is a crisis. And together, the executive sponsor and the crisis will engender change somewhere. And that change management process so often turns into a business intelligence solution. And nothing is an industry. Something I'm personally trying to always keep in mind is: what's the purpose? What's the optimization? What problem am I trying to solve? Rob Collie (00:30:30): Yeah, one time, I was asked by a client to help debug a report that was really slow. So, this is great because this is an example of a report that I didn't build, right? I can use an example that wasn't one of my own failures, but I'll tell my own as well if you want. But I go, "Okay, I'll take a look at it." I'm expecting some sort of DAX or data modeling problem or something like that. And they show me the report, and it is a 100,000-row pivot table. The pivot table has 100,000 rows in it. There's DAX behind it. It's a DAX data model behind the scenes, but the report itself, the output, is 100,000 rows. And before I even engage, I just turn and look at them and say, "Oh, my God, who was using this? You don't have a performance problem. It's..." And they're very insistent. "No, no, no, no, no. This is the thing. We need this." I'm like, "All right." Rob Collie (00:31:21): So, I start looking at it, and it's crazy how many columns there are. And it was a list of every employee and every location that they have in the country, which was hundreds of locations and thousands of employees. And for each employee, their scheduled time-in and their scheduled time-out, and their actual time clocked in and actual time clocked out. I turned back at him again and I go, "Okay, really? What are we doing here?" And they're like, "Okay. So, we have all these regional managers that are looking at this multiple times a day, probably eight times a day or more, to try to figure out if any of their stores are empty, aren't staffed because people didn't show up." And I just smacked my forehead and I go, "You don't need the timecard report," which is what they called this thing, the timecard report, "You need the empty store detector." Rob Collie (00:32:18): And I mean, there was no way to make this thing faster. I mean, this thing was such a gross misuse of technology. I just went to the whiteboard and I sketched what the empty store detector could look like, and they're like, "Oh, that's great. We'll never get our managers to switch over to using it, so let's just go back to fixing this other piece of junk." Jen Stirrup (00:32:37): Yeah, because something that I struggle with, personally, is the idea of surveillance reports. It's something that really bothers me. I've pushed back on a few customers to say, "Are you micromanaging or are you surveilling? What is it you're trying to do?" On occasions, I have escalated it to say, "Look, this report is probably being used to hit people over the head, and I'm not comfortable with this because I think this has gone beyond micromanaging." And we had set the scope of the project of the thing we were supposed to deliver.
So, I'm going to escalate this because I want to understand better the purpose. And if I'm wrong, we will deliver it." Jen Stirrup (00:33:12): And normally, when I go back and see that, even in that particular instance, I showed the senior management and I said, "Your middle management want to do this." And they said, "No. We are not spending time doing that. We need to understand the wider context. If there are any issues going on with staffing, then this is probably a symptom rather than the cause of the issues, if people are being watched like that." So, I think sometimes escalating, as much as I don't like to do it, is the best way forward. Rob Collie (00:33:44): It takes a lot of professional courage to do something like that. For example, have you ever taken one of those principled stands and ended up no longer working for that client because they basically fired you for not staying in your lane? That's a risk, right? Jen Stirrup (00:34:01): Yeah. It is. I've never been fired for that, but I have said, "I'm uncomfortable, and we're going to stop delivering services, and we need to decide on an exit strategy." There's different ways you can do that, right? So, you deal with the current project. You then say that you're busy for the next century when they come back to you for other work. I don't like doing that because I often feel like you should give them an alternative to say, "Well, here. I can't deliver it, but I know someone who can." And then I recommend someone from my network. But the thing is, when I make these quite principled stands, people back down often, or they back down and they just ask me to do it. But when I've gone back to people like that customer, who come back to me for extra work, I've done some investigating work and I've found that they have not implemented the thing that I've been worried about or concerned about. Jen Stirrup (00:34:49): So, I think, sometimes, if you do speak up, people are maybe surprised by it. It's maybe different who it comes from. And I think, perhaps, even a soft Scottish accent, smiling sweetly at them and saying, "Can you explain to me a bit more about the reasoning behind this? Because your team want to do this thing, but I have some discomfort because it's outside scope." And they're not telling them, and they're very direct. Wait at first, but they start to get their message. Jen Stirrup (00:35:16): A former boss of mine years ago, he said I had a soft rein approach. I actually think that's a nice way of putting it, where, as much as I might be tempted to go in all guns blazing, I'm trying to gently bring it up and then bring it up again a bit more firmly, and then, suddenly, people are starting to understand better. But that's me having to probably, sometimes, exert a huge amount of self-control as well. But I think that's part of the consulting game. It's very tough. But I think seeing something like that happen, I think the reason it happens is because people aren't thinking about it longer-term. And me as a consultant, it's easier, perhaps, for me to think about it long-term and also a bit more closely as well, because you are thinking about the consequences of what you're trying to do, the purpose. Rob Collie (00:36:04): Yeah.
If you're good at data and you're experienced with it, you spend a lot of time with it, that allows you to put some of those things a little further down in the subconscious, and the rest of your human faculties can resume working, whereas, I think, for people for whom data is still this arcane thing, it's not the thing that they've spent their lives with, it's just really easy to get target-fixated on the data, data, data, data, right? "It's not about the people, we're trying to figure out the data," right? "And inform me," and all of that. Rob Collie (00:36:33): And I think it's like when you're first learning to drive, I couldn't have the radio on. The radio was really distracting. And you certainly couldn't have a conversation with someone next to you. So, all you can do is just make sure that you're turning the wheel the right amount and all this kind of stuff. It's just overwhelming. But once you internalize all that stuff and you build the muscle memory and all those sorts of things, now your brain is free to do some other things. Like this data fluency thing we were talking about, it's neat how, as you climb that slope, you're never there, it's a perpetual journey, the other parts of the equation like the human things, right? They can come back. Rob Collie (00:37:12): An example, even just from our own business, we do a lot of internet advertising. And sometimes, when people at our company are thinking about this, now the wrong way to do it is to go and like, "Oh, let's go look at the AdWords API and let's get fascinated by the tech around this." And I'm always trying to remind people that, no, no, no, we're trying to scale a human interaction. That's what we're trying to do. We're trying to reach people with our humanity- Jen Stirrup (00:37:43): I think that's so true. Rob Collie (00:37:43): ... and we're using a technological system to do that. It's a tool for the other thing. Jen Stirrup (00:37:50): You're so right. I think we should be using technology to empower and enable. And I think my personal mission is about helping people. I find that rewarding, personally. I like things with a purpose, so that's why I do charity work with organizations like DataKind, because when you get someone crying because you've solved a problem for them and you've helped them, you know how incredibly grateful they are. But I think, for me, that's why diversity and inclusion, equality, and intersectionality more recently have become really important to me. Jen Stirrup (00:38:21): I'll just give you a few examples that are in my head. I did a project recently, and there was a woman of color in my team, and I felt that she was being talked over. I'm used to being talked over, being softly spoken. But I could see it with her. And I just made a conscious effort to say, "I'm sorry, but I don't think she's had the opportunity to speak, and I can see she's tried to have some input." So, some of it's a bit like that. But some of it is directly saying, "What do you think? Sorry, we haven't heard from you," and pulling people out. And you know what? She was and is still incredibly insightful. And sometimes, the best data scientists I work with are people who can't code. And I think about her and I think about another woman of color as well that I work beside. Jen Stirrup (00:39:06): Fantastic data scientists, they both know Excel, but they can't write a line of code. And the reason they're so good is because they ask such fantastic questions. That means the rest of us who can code have to then go and get the answers.
And I think the knack of asking the right questions is such a gift, it's such a skill, and it's something that I am consciously trying to improve in myself. And I think diversity, inclusion, and equality are really important, but we wouldn't get anywhere with any of that if we're not giving people the space to talk or the space to ask the right questions.
Jen Stirrup (00:39:42): Now, I am constantly learning every day. And to do that, I'm having to learn to get better at asking questions. And it is a skill to ask. But I think, when we're dealing with data, it's about helping people not to feel stupid if they're asking questions, because in these particular cases, it's very easy to feel diminished in a conversation where other people understand the technology, they can code, you can't, but you've got an insight. I know we talk about data-driven, but I like the term "insights-inspired," and I wish we had more of that, because that, I think, gives us room for other people who perhaps don't understand the technology but do have business insights that I would never get, because they help me interpret the code or the data to make it better.
Thomas Larock (00:40:28): So, you said data-driven, but you prefer insights-inspired. I think those are still two different things because, when I think of data-driven, I actually think of that in terms of, "I'm going to make a decision based upon what the data's telling me, not upon my feelings." The insights-inspired, to me, is how I get to the question I want answered, right? But I'm still data-driven. I think there's some overlap, but I also think there's a lot of space where they are distinct, because I do believe in data-driven. I've been in those meetings where somebody's like, "Yeah, I don't really care. We're going to do what I think is right." "But the data says something completely opposite." "Yeah. That doesn't matter to me." And lots of those cultures exist. I love insights-inspired, and I'm going to steal that.
Jen Stirrup (00:41:16): That's fine. I think we need both, actually. I'm sorry if I wasn't clear. But you're right, there is a good impetus for people to think, "What does the data say?" And I like that. I think the insights-inspired piece will help us understand whether the data's right. And I'll give you an example of something that I did. I was doing some work for the National Health Service, and there was some data missing for a hospital, and it was not an insignificant amount of data. It was about five years' worth. And I searched for it all morning, and I was just about to march down the corridor to go and corral a DBA and ask him, "Have we lost any data? Because I cannot find this."
Jen Stirrup (00:41:55): And then, when [inaudible 00:41:56] was passing, she said, "How are you doing?" I said, "Oh, have you ever worked at this hospital?" I won't mention which one it is. And she said, "Oh, I was there for five years, until it closed and merged with another hospital." And I thought, "Oh, you've just answered my question. Right." Because I was sweating beads, because I thought, "We've lost five years' worth of data. We're done. We are in so much trouble," because it's a lot of data. It's a lot of patient data. But no, no, no, no. It had gone somewhere else. And there was a very good explanation that I would never have got from the data. I could have hugged her.
Jen Stirrup (00:42:31): And to this day, I still feel the palpable relief, because I was walking around the hospital, thinking we need a really good explanation for this, because according to the data, it was not there. So, when I look at data-driven, I think they're two sides of the same coin, because insights will tell you what the nurse said, "Well, actually, it's like this," and they will add to the interpretation.
Jen Stirrup (00:42:54): I sat in a meeting once where one of the leaders said, "All right. So, we've got the data now?" I said, "Yes, everything's fine." And in front of four of his team members, he said, "So, we can get rid of the business analysts then, because we've got the data now." And even when I mention this, I still, at this point, feel my blood pressure rising, which is not good for me. I am well over the age of 40. And actually, I was stunned. I said, "How are you going to understand the data if you don't have your business analysts? Who's going to tell you what it means?" "Oh." I said, "Are you really thinking that you can just throw your data at a wall, see what sticks, see what's left, and that's going to drive a business? Because, pretty much, that's what you're doing if you are not involving the people who understand the business."
Jen Stirrup (00:43:43): And after the meeting, I mean, some of them were crying, saying, "He was talking about me losing my job." And the people impact was terrible. So, this is where my principles come in. I went and escalated that afternoon, and he was taken off the project the next day. That needed to happen. It was just outrageous. And if any of you listening recognize yourselves in this: I love that team, their insights were incredible, and I learned so much from them. And to the leader in that organization: please listen to your team members. You will get so many great insights.
Rob Collie (00:44:23): Wow.
Jen Stirrup (00:44:24): Sorry, this is very cathartic for me. I'm glad you've brought me on today.
Rob Collie (00:44:33): I mean, just watching your face as you told that story, I can see the emotions that you're feeling, right?
Jen Stirrup (00:44:37): He's going to get this.
Rob Collie (00:44:38): And it's a mix, right? It's a mix of the beauty of some of these people that you worked with, contrasting with this horrible, horrible attitude, at the same time, from this one individual. When you have all those feelings at the same time, it's like you need a new name for it. It's like, "What is this feeling?"
Jen Stirrup (00:44:56): And I think the industry is like a pendulum, so we go towards data-driven. And some organizations do need good data-driven; Tom's given a great example. But sometimes it goes too far and they say, "Yeah, I read that buzzword. I'm going to do that." And then there's an expense, something has to give. And that, unfortunately, was his team. Like you said earlier, Rob, it's about the people. We should be there to help people do their jobs better, not necessarily to replace them. That was never on the menu.
Rob Collie (00:45:29): Yeah. It's counterintuitive. Sometimes, when your data system gets better, the right move is to have more analysts, because there's more ROI in having them. Even just hiring a data professional services firm such as yours, the reason to do it is because the ROI can be massive.
Jen Stirrup (00:45:51): Yes. There are lots of unseen costs.
I worked with an accountant last year who spent four out of five days a week merging Excel files together. And I sat with her, I got to know her pretty well, I mean, remotely because of COVID. And eventually, she said, "Oh, I'm looking for a new job." And I said, "Oh, really?" And she said, "I did not incur graduate debt to sit and do something that I could have done without my degree." She'd put in a lot of effort and, same as in the US, taken on lots of student loans to do a degree. And she said, "Technically, my job title is accountant, but I'm not accounting. I am munging data around in Excel." And one of the projects I had recommended was data integration, right? And they wouldn't go forward with it. They kept saying, "No, no, no. We've always done it this way. So-and-so in accounts does all that." But they never asked her what she wanted.
Jen Stirrup (00:46:43): So, she left, and I was not one bit surprised, because she said, "I want to be an accountant. I want to account." And I know it wouldn't be my personal choice of a job, but she just loved it, and she wasn't getting to do it. So, sometimes, the costs are quite unseen if you're not looking after the processes or the data, because that then incurs hiring costs and staff onboarding costs that often don't get included as part of these business strategy projects. When I'm doing a data strategy, I try to include them, to say, "What happens if you change? And what happens if you don't?" Because you're going to lose people, because your people very often want to be skilled in the latest technology.
Jen Stirrup (00:47:25): And I'll give you an example. One customer I worked with said to me, "We need your help with Reporting Services, SQL Server." So, "Okay, good. I like Reporting Services." Then they talked to me and I said, "What version are you using?" And they said, "2005." And I said, "Why?" "Because the application that's using it requires SQL Server 2005 and we can't upgrade." I said, "So, what was the application written in?" "VB6," which you may have heard of. That technology was around in 1999. It was last century. So, the data estate was antique. I had no idea that it was that bad. But then the application came up, and Microsoft still do a version of Visual Basic; you can go to the site and get the latest version... But the point being that the staff in that place had settled for VB6, they'd settled for 2005. And that doesn't mean that you're getting the best team members. And when we worked with them, we recommended an architecture. I said I was not touching it with a [inaudible 00:48:30].
Rob Collie (00:48:30): I'm still very fluent in VBA6, so maybe after we finish this show, can you give me the information for this organization? I might go apply. The last place on earth where VBA6 fluency is... Actually, that's not true. It's still being used everywhere. It's just not being used centrally.
Jen Stirrup (00:48:53): Yes. I did say to them, "I am not touching any software that was not built in this century. So, if it's from the last century, you've no chance." So we re-architected it; actually, we're using Azure Cosmos...
Thomas Larock (00:49:04): It's a good rule.
Jen Stirrup (00:49:05): ... and .NET... Yeah, it's a good rule. It's a rule to live by, you can quote me on that. I use no software built in the last century. In fact, I'm going to make that my new company advertising strapline. That's great. I like that. So, they're happily in Cosmos and .NET.
And we used that because the developer said, "Hey, does that mean we get to modernize?" I said, "Yes. And you will either modernize or I will leave. Your bosses are going to have to modernize." So, they did. But again, that soft Scottish accent comes up: "Well, why don't we use software that's built in this century?"
Rob Collie (00:49:42): It's a devastating maneuver. If we were making a card for you in a trading card game, that would be one of your two power moves, right? Soft Scottish accent. And the description of the power is something like, "Removes all defensive screen cards from opponent."
Thomas Larock (00:50:07): Disarming.
Jen Stirrup (00:50:10): Absolutely. Yeah. It's just funny how the data problems really throw up what's wrong with the organization. Obviously, they did that work, but a couple of years on, I went to visit them again, just before COVID last year. They'd implemented a data science team and they just wanted some strategic consulting. And I was really pleased with how they'd turned around. So, sometimes, if you just find a problem like that, a small success, and you build on those small successes, they're allowed to add up. I don't know if you see this, but a big thing of what I'm doing when I'm in organizations is change management, and a lot of that is people. And people tend to align themselves with success. So, if you can just show one small success, people get on board with it.
Rob Collie (00:50:53): Yeah. I mean, it's everywhere in humanity, right? We're fundamentally pattern-matchers. And if you haven't given a population any positive patterns to match, no examples, it's amazing how stuck you can be. But one success, right? We have an infinite percentage increase in our population of successful examples. We went from zero to one. Like you say, the dog knows that there are five treats in your hand, right? We're not dumb. If there can be one success, there can be more. But if there are zero successes, that's powerful.
Jen Stirrup (00:51:25): Yeah. And I don't know if you see this problem, but something I see a lot is people thinking that maybe Tableau or Power BI, they buy this, and it's going to give them a success. And it does, until the data starts to get hard. And then they have to scale up in DAX, which is fine, but sometimes they don't have the room or bandwidth to do that, so they get almost a bit depleted, because they realize that, actually, data's hard. We've never really nailed data as the human race.
Rob Collie (00:51:55): It's always hard. Unfortunately, to sell software, to a certain extent, you have to sell the lie. If you're a software vendor, you have to sell...
Rob Collie (00:52:03): ... the lie that this tool is the magic fix, that it's going to make data easy. And I do actually, in a weird way, kind of blame Tableau for making this worse, while at the same time being very grateful to Tableau that they made interactivity a must-have.
Jen Stirrup (00:52:24): Yes.
Rob Collie (00:52:24): I think they were actually, more than any one entity, responsible for breaking this notion that Reporting Services and similar tools were it.
Jen Stirrup (00:52:34): Yes. I remember the first time I saw Tableau. I had been hired as a developer for SQL Server [inaudible 00:52:40] Services, and my boss said, "I think this is the future, this stuff, Tableau. Here's the download link. Tell me what you think."
Within 10 minutes I was completely hooked, and it changed my career, because otherwise I would probably have stayed in the database reporting world, and I suddenly thought, there's a whole world here with this stuff. So I love what they did. I really, really think it was groundbreaking.
Thomas Larock (00:53:01): At what point did a report just become synonymous with the word "Tableau"? I have limited experience and maybe it's an outlier, but I always hear people say, "I'm going to run a Tableau report." I mean, it's just a report. I worked with Crystal and BusinessObjects, same thing I guess. Do people always qualify the type of report they're running, as if that makes it more special? Do people also say, "I'm going to run a Power BI report"? Why is there always a qualifier? In my case, I always hear, "I'm going to go run the Tableau report." I'm like, "It's just a report. It doesn't really matter what software is doing it. It's just data. It's just a report." But I hear that a lot. I just figured I'd ask you two if that's been your experience as well.
Jen Stirrup (00:53:43): Yeah. I think I'm hearing that more and more, and I actually think it's almost going the other way, where people are only wanting interactivity, they're only wanting things they can click and tick. What they're not wanting as much is a SQL Server mahogany red, forest green, slate gray corporate template, because that was, what, about the four templates you got with Reporting Services. So I see that more and more, apart from the finance world. They still very much want it. But what I still see is a big need for tables. People still want to export to Excel. And I think it was you, Rob, who actually said this years ago, that the third most common button in Tableau is something like "export to CSV."
Thomas Larock (00:54:26): Yeah.
Rob Collie (00:54:28): Yeah. The third most common button in any data application is "export to Excel."
Thomas Larock (00:54:32): Yeah.
Rob Collie (00:54:32): Behind "OK" and "Cancel." That's the joke. And what it is, is an acknowledgment, again on the human plane, that this report, this app, does not meet your needs. In a way, if you could instrument your organization and find all of the "export to Excel" buttons that are being worn out, those are the hotspots for you to go and improve things. That button being clicked, click, click, click, click all day long, is telling you that there's a tremendous opportunity for improvement here, both in terms of time saved, but also quality of result. Quality of the question that's even formulated. You mentioned questions earlier, asking good questions. Here's the problem. The ability to execute on answers, or the inability to execute on answers, the friction, the inertia, works its way upstream into the question-forming muscles. The question-forming muscles atrophy to a level where they fit the ability to execute on the questions. And so when you suddenly expand the ability to answer questions, you've got to go back and re-expand your question-asking muscles to be more aggressive, to be more ambitious.
Jen Stirrup (00:55:52): Yes. I think sometimes the data-driven piece is trying to, in a way, subtly bring that back into play. It's okay to admit that we don't have all the answers, and it's okay to admit that we need to ask questions. I think there should be more of that. Certainly earlier in my career, asking questions was discouraged. It meant you didn't know something.
It meant that you were vulnerable in some way. And I think, as an industry, we need to encourage people to ask questions. With the diversity and inclusion piece, I try to make a conscious effort if I think someone in the meeting is being quiet, regardless of their background. At least I'm trying to watch out for that now, whereas maybe 20 years ago I wouldn't have realized it. Sometimes people do need that extra help to speak up and speak out. They often don't know what to say or how to break into a meeting and say something. It's quite difficult.
Jen Stirrup (00:56:51): Especially if you're being measured on your performance. I think sometimes people say things very confidently, and actually, when you start to pick it apart, you think, "I need, as a person, to stop believing the confidence and think about whether it's right, not how it's being delivered." I think there's still room for quiet voices, hopefully like mine, who are trying to say things, but I do find it harder to get heard. I think it's good that you do podcasts like this, because it gives people the opportunity to talk about different ideas and how they impact people, because that is important. There are loads of vendor podcasts that will talk all about the technology, but we need to know better how to apply it.
Rob Collie (00:57:31): When we were talking about starting this show, it was pretty clear we did not need another tech show. We wanted people who are working in tech but are human beings, like yourself, and who are focused on helping other human beings. We weren't sure if it was going to work. It was one of those, "Are people going to listen?"
Thomas Larock (00:57:45): We're still not sure.
Rob Collie (00:57:50): We knew that we were going to like it, but yeah, it's building an audience. I've enjoyed it. And plus, it's an excuse to get together and talk with people such as yourself. If we just pinged you out of the blue and said, "Hey, do you want to get on a two-hour Zoom call with us and just catch up?", that's going to get pushed and pushed and pushed and pushed. But, "Oh, a podcast? Oh, well, yeah. That's exciting."
Jen Stirrup (00:58:14): Yeah. I know what you mean. It's good, I think, to try and translate data and technology into something people feel is within their reach, because I think there is still an element of people being almost scared of working with data. I deal a lot with CTOs and CIOs. Often the CTO in some way reports to their CFO, because the CFO is over all of it, keeping costs down. The CTO has to work really hard to justify the costs. And I think what they want, ultimately, is not to appear stupid or not to know what they're doing. So some of these leadership conversations I have are about people saying, "Explain these terms to me. I don't know what a data lakehouse is. Do I need one? How's it different from a data lake? What about the warehouse? Is that going away, or is that being rebranded as well?" I know Microsoft talked about data hubs recently. If you're a data vault person, a data hub means something quite specific. It's been a term around for 30 years, meaning something else. But I think sometimes people get very confused with the terms.
Rob Collie (00:59:16): Like, for example, the noun "dashboard" in Power BI, right? It's just a head-clutching, frustrating mistake. I mean, a Power BI report is probably best described as a dashboard. The multi-visual, interactive experience, the lowercase-d dashboard, is what I always want to describe it as, but no, no, no, no.
We repurposed that word.
Jen Stirrup (00:59:41): I know, and customers don't always understand it, because they say, "Well, actually, my report looks exactly like the dashboard. So I don't understand this publishing thing." So I have to try and explain that, actually, we can take data from [inaudible 00:59:55] here and you can add extra things. I'd be interested to know, actually, how much time Power BI users spend making dashboards as opposed to making reports. I just wish we had the answer to that, because sometimes you just want to give people reports that they can run on their desktop, or sometimes in a browser, and just have the reports, rather than having them open on the actual dashboards higher up. So I feel that's a bit of a separation that maybe wasn't required. But Tableau does something similar, doesn't it, in a way? Although I think with Tableau, it's a bit clearer that you're putting these things together.
Rob Collie (01:00:29): Well, we were talking at the beginning about the importance of comprehensive training sets. Well, let me just tell you, we only need one data point here. I, as a Power BI user, have never once created an actual Power BI dashboard. So let's just conclude that that's it. No one uses them. But yeah, I've never felt compelled to need one. I tend to put together what I need in the report.
Jen Stirrup (01:00:56): Yes. And that's what I do, because I'm trying to get the customer from A to B. I'm trying to do it quickly, and I can see that they've reached that tool ceiling of where they want to go, and then they've got this other thing they need to do, and they don't understand why. So sometimes it's a battle I just don't fight, because I think, "You know what? They've often been through so much to get to that point in the first place: cleaning data, getting access to the data, all the things that are hard, and even understanding what they want in the first place." I try and work out where the fatigue is.
Rob Collie (01:01:28): Yeah. I think there's a certain hubris just in the idea that a user will go around and harvest little chunks out of other reports and take them completely out of context. Anyway, we didn't come here for cynicism today but-
Jen Stirrup (01:01:43): I have plenty of that.
Rob Collie (01:01:43): But it's still there. We can't really help it. So, it's come up a few times, and I want to make sure we actually make some time to talk about it specifically. You've mentioned inclusion and diversity a number of times, and already shared a few anecdotes from within your own professional organization, within your own firm. Outside of your own Data Relish organization, what are you up to in this space, with diversity and inclusion as a cause? You're very active in the community in this regard. Can you summarize for us what all you're up to?
Jen Stirrup (01:02:15): Yeah. I've started to talk more about intersectionality...
Afua Bruce's career in data science and technology is dotted with a bunch of impressive acronyms: IBM, the FBI (!), even working for the POTUS (Obama) as Executive Director of the National Science and Technology Council. Now, she's taken on a new challenge as Chief Program Officer for Datakind, a global non-profit that harnesses the power of data science and AI in the service of humanity. Heck, she even has her own statue as part of the #IfThenSheCan women in STEM initiative (https://ifthenshecan.org). Tune in to hear an amazing story of a truly unique life path. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com
In the second half of my conversation with Ben Kinsella, you'll learn about his experience transitioning from a PhD in Spanish linguistics to data science for humanitarian causes. Enjoy!
Benjamin Kinsella, PhD, is a project manager and researcher working at the intersection between data science and human-centered design. After completing his PhD in Spanish linguistics at Rutgers University, Benjamin now works at DataKind, a global nonprofit that harnesses the power of data science in the service of humanity, bringing together interdisciplinary approaches to help solve humanitarian issues.
Hannah Underwood is a self-confessed charity and data geek with an unquenchable desire to improve as many young lives as possible. In her early career Hannah learned how to measure performance and use data to improve organisations. At the age of 25, Hannah became CEO of The Key, a charity inspiring belief in young people. Working with venture philanthropists Impetus PEF, they used data & metrics to build a vibrant, robust and well-respected organisation. Outside of running The Key, Hannah has been a Director of DataKind UK since 2013 and became part of the 'Data For Good' movement. When she's not trying to change the world, you'll find Hannah chasing a mucky toddler or renovating their old farmhouse in Northumberland. --- Send in a voice message: https://anchor.fm/mark-longbottom2/message
The following is a conversation between Jake Porway, Founder and Executive Director of DataKind, and Denver Frederick, the host of The Business of Giving. In this interview, Jake Porway shares the following:
• The role of data science during COVID-19
• The challenge of data digitization
• How data science is weaponized
• To get anti-racist algorithms, you need anti-racist systems and leadership
This week we welcome non-governmental organization leader and former U.S. Army intelligence analyst Cassy L. Cox. Cassy transitioned from active service to a role in the private sector as a corporate recruiter, first with a small start-up headhunting firm and then with a large accounting firm. Feeling a bit unfulfilled, she decided to take an 80% pay cut to join an American international non-profit, called the International Rescue Committee (IRC), whose work began with and remains focused on refugees and displaced persons. Between 2009 and 2015, Cassy worked for the IRC on refugee and IDP programming in the United States, Liberia, Kenya and South Sudan. Cassy then transitioned to be the Programmes Director in Somalia for Concern Worldwide, an Irish international non-profit organization that works to reach the most vulnerable populations across the globe. In this role, Cassy covered all humanitarian sectors including education, cash assistance, women's empowerment, and internally displaced persons. Cassy is currently the Strategy Lead for DataKind, a non-profit organization based in New York City, that harnesses the power of the world's most talented volunteer data scientists to solve some of the world's largest problems. In this episode, Lindsey and Cassy discuss finding meaning and purpose in one's career, the non-governmental organization industry as a career option, and some of the challenges Cassy faced as a transitioning female veteran. ABOUT US Welcome to the FourBlock Podcast, a show that examines veteran career transition and the military-civilian divide in the workplace. General Charles Krulak coined the term "Three Block War" to describe the nature of 21st-century military service defined by peace-keeping, humanitarian aid, and full combat. But what happens next? Veterans are often unprepared to return home and begin new careers. We call this the Fourth Block. FourBlock is a national non-profit that has supported thousands of transitioning service members across the nation in beginning new and meaningful careers. Mike Abrams (@fourblock) is an Afghanistan veteran, FourBlock founder, director of the Columbia University Center for Veteran Transition and Integration, and author of two military transition books. He'll be representing the military transition perspective. Lindsey Pollak (@lindsaypollak) is a career and workplace expert and New York Times bestselling author of three career advice books. Lindsey will be representing the civilian perspective of this issue. Veterans, explore new industries and make the right connections. Find a career that fits your calling. Join us at fourblock.org/ Over 77% of FourBlock alumni stay at their first jobs over 12 months. Sponsor our program or host a class to equip more of our veterans at fourblock.org/donate. Follow FourBlock on Social Media LinkedIn Facebook Instagram Twitter
Drew Conway is a world-renowned data scientist, entrepreneur, author, and speaker, perhaps most well-known for his infamous 2010 “Data Science Venn Diagram”. Today, Drew is the Founder & CEO of Alluvium: a company using machine learning and AI to turn massive streams of data produced by industrial operations companies into insights that bridge the gap between big data and human expertise. Designed with the goal of helping industrial operations become safer, more efficient and more profitable, the Alluvium platform makes industrial machine data meaningful and useful to the people who rely on it to make decisions that affect the stability of their operation. Before starting Alluvium in 2015, Drew helped start:
Data Gotham: an organization focused on supporting the NYC data community, with an annual conference bringing together people from all industries
DataKind: a non-profit that brings high-impact organizations together with leading data scientists to use data science in the service of humanity. They enable data scientists and social changemakers to address tough humanitarian challenges together, ranging from education to poverty, health to human rights, and the environment to cities.
After starting the conversation by exploring Drew’s early years, we focused most of the dialogue around his (quite frankly, brilliant) thought process around identifying the highest-impact, most-needed applications for data science across problem spaces. Some of my favorite talking points included:
Why “force of will” and a “tendency toward combativeness” were key to Drew’s early development and overcoming imposter syndrome
Lessons learned from his 4th grade teacher who told him he was bad at math and an AAU basketball coach who made his team find their way home from the outskirts of Las Vegas
The questions Drew asks executives who tell him they want to hire a data science team, how he recommends they avoid being “seduced by the industry” and “return back to first principles”
Drew’s process for determining new applications for data science within various industries
The three-question mental model Drew used to identify Alluvium’s first major product offering: business problem → data available → human support
Alluvium’s team-building and hiring philosophy, how it’s evolved from day one until today
The story behind DataKind, how he and his team decided what nonprofits to start by working with, and the step-by-step process they took to testing their assumptions
Enjoy the show!
Show Notes: https://ajgoldstein.com/podcast/ep14/
Drew’s LinkedIn: https://www.linkedin.com/in/drew-conway-13b5b013/
AJ’s Twitter: https://twitter.com/ajgoldstein393/
Gianfranco is an independent data scientist and consultant in London, UK. Curiosity has been the main driver of how he has developed his career since leaving the Politecnico of Milan in Italy, pushing him to spot technology trends, study them and put them to the test. He was a web developer before Yahoo! and Google were founded; a sysop for large Internet Service Providers before broadband was even available in UK homes; a digital transformation management consultant with Deloitte 10 years before the first iPhone was launched; and he ran the biggest official Olympics website in history, for London 2012, by the time smartphones had become the main digital platform of choice in everybody's pockets. When he's not at work, he volunteers for both DataKind and MyData.
Jake Porway, Founder of DataKind, joins hosts Nick Ashburn and Sherryl Kuhlman to discuss how he's bringing high-impact organizations together with leading data scientists to use data science in the service of humanity on Dollars and Change. See acast.com/privacy for privacy and opt-out information.
In times of extreme upheaval, why do some people, communities, companies and systems thrive, while others fall apart? Andrew Zolli answers that question and more. Andrew is the author of the best-selling book “Resilience: Why Things Bounce Back”, published by Simon and Schuster in the U.S. and in many other languages and territories around the world. The book is based on his research on the dynamics of resilience in many contexts: people, systems, communities, and companies.
"Resilience forces us to take the possibility, even necessity, of failure seriously." - Andrew Zolli
On this podcast we talk about:
Why he wrote a book about resilience
How organizations and people bounce back
Social media and resilience: does it help or hurt?
Society, and whether social safety nets make us more fragile
The impact of faith on resilience - it's not what you think
What he learned about organizational resilience in New Orleans in the aftermath of Hurricane Katrina
and much more
I would love to follow Andrew around for a few days, as he is involved in a lot of very interesting projects. For 11 years, he was the creative force behind PopTech, a renowned innovation and social change network. He served on the board of the Garrison Institute and Blurb. He also serves as an advisor to PlanetLabs (a revolutionary Earth-imaging company), DataKind, which is bringing data science to the social sector, and The Workshop School, an experiment in what a public high school can be. He served as a Fellow of the National Geographic Society. He advises governmental organizations, startups, and cultural and civil society groups, including leadership teams at companies like GE, Nike and Facebook.
Show Notes and Resources
[00:07:57] Where we discuss the forces that likely put Trump in office
[00:13:53] Resilience and the inverted ∩
[00:22:05] Resilience as a skill
[00:23:33] An example of organizational resilience in New Orleans after Hurricane Katrina
[00:32:59] What makes organizations resilient
[00:32:59] Middle management and organizational resilience
[00:34:20] Social programs and resilience
[00:43:40] The 10 factors that encourage personal resilience
[00:44:35] I ask if Facebook is making us stronger or weaker
[00:50:52] Intentionality and emotional control
Suchana Seth speaks about different definitions of fairness in the context of machine learning. Suchana Seth is a physicist-turned-data scientist from India. She has built scalable data science solutions for startups and industry research labs, and holds patents in text mining and natural language processing. Suchana believes in the power of data to drive positive change, volunteers with DataKind, mentors data-for-good projects, and advises research on IoT ethics. She is also passionate about closing the gender gap in data science, and leads data science workshops with organizations like Women Who Code.
Data has the potential to help fuel social change across the world, yet many relevant datasets remain locked away and siloed across government agencies, nonprofits, and corporations. What kind of collaboration does it take to make this data available to different actors working to create change? In a series of TED-style talks, Melinda Rolfs of the MasterCard Center for Inclusive Growth, John Wilbanks of Sage Bionetworks, Greg Bloom of Civic Hall Labs and Open Referral, and ST Mayer of Code for America talk about how to develop not only the right tools, but also the right relationships to make data collaboration happen. Jake Porway of DataKind then leads a discussion on how we can collectively harness data for the greater good. View the slides from this session here. https://ssir.org/podcasts/entry/unlocking_data_and_unleashing_its_potential
In a time of profound and sustained disruption and volatility, organizations need greater agility, innovation, and creativity than ever before. In this talk from our 2015 Nonprofit Management Institute, Andrew Zolli provides a big-picture view of critical trends and forces of change that will shape the decade to come. He discusses the biases that limit our understanding and explores new ways that organizations can create more resilient organizational strategies and cultures. Zolli is the co-author of Resilience: Why Things Bounce Back and the former director of the innovation and social change network PopTech. He serves as an advisor to organizations including DataKind and The Workshop School. https://ssir.org/podcasts/entry/thriving_in_an_age_of_volatility
In the opening keynote at SSIR's February 2016 Data on Purpose conference, Jake Porway shares best practices for data storytellers and shows why knowing what the data is or is not saying is critical to creating ethical and accurate visualizations. Among other things, he explains the pitfalls of pie charts, why you should be wary of word clouds, and why good data storytelling ultimately means good statistics. He also argues that the real power of data storytelling lies not just in reporting on past activity, but in driving decision-making for the future. Porway is the founder and executive director of DataKind, a nonprofit that uses data science in the service of humanity. He previously worked at the New York Times R&D Lab, Google, and Bell Labs, and has spoken at IBM, Microsoft, and the White House. He holds a bachelor's degree in computer science from Columbia University and a master's degree and a doctorate in statistics from the University of California, Los Angeles. If desired, you can follow along with the slideshow that accompanied Porway's presentation here. https://ssir.org/podcasts/entry/podcast_practice_safe_stats_a_psa
This week Noelle Sio Saldana discusses her volunteer work at Crisis Text Line - a 24/7 service that connects anyone with crisis counselors. In the episode we discuss Noelle's career and how, as a participant in the Pivotal for Good program (a partnership with DataKind), she spent three months helping find insights in the messaging data collected by Crisis Text Line. These insights helped give visibility into a number of different aspects of Crisis Text Line's services. Listen to this episode to find out how! If you or someone you know is in a moment of crisis, there's someone ready to talk to you by texting the shortcode 741741.
Jake Porway, executive director of DataKind, explains how NGOs use data, how they can use it better, and what data science can do for civil society. Speakers: Elizabeth Eagen, Jake Porway. (Recorded: May 07, 2014)
Episode 4
In this episode Bob & Jay talk with Kymberlee Price @kym_possible about her work with vulnerability data at BlackBerry and her real-life superheroic philanthropic work.
Resources / people featured in the episode:
One Spark Foundation - https://www.facebook.com/onesparkcanstartafire [FB]
Beading Divas (Greyhound and general animal welfare advocates)
Help Aidan Love Fight Cancer
Project Genesis (advocacy and support for victims of human trafficking; Seattle has the third highest rate of underage sex trafficking in the US)
Homeless shelters - no specific link - I mentioned the Seattle Tent City, but there are countless organizations in local communities worldwide that can use your help to prevent homelessness, and help those who are homeless.
Spots & Stripes Exotic Cat Sanctuary - https://www.facebook.com/spotsandstripesbengalcatrescue [FB]
Hackers for Charity - Johnny is such an amazing guy, I'm honored to call him my friend. He would tell you he isn't a superhero either. That is one of the things I love about all my inspirational friends. None of them do this for their ego or to promote their self image/social standing. They do it because they believe it is the right thing to do, and it makes them feel good to know they have made a difference for another person (or animal).
DataKind
Digital Preservation 2013
Speaker: Hilary Mason, chief scientist at bitly, co-founder of HackNY, creator of dataists, and member of NYCResistor, opened Digital Preservation 2013 with her keynote talk on the delicacies of data. Hilary Mason is the chief scientist at bitly, a company that studies attention on the internet in real time, doing a mix of research, exploration, and engineering. Mason co-founded HackNY, a non-profit that helps talented engineering students find their way into the startup community of creative technologists in New York City. She is also an advisor to a few organizations, including knod.es, collective, and DataKind, as well as a mentor to Betaspring, the Providence, Rhode Island-based startup accelerator, and TechStars New York. She's a member of Mayor Bloomberg's Technology and Innovation Advisory Council, which has been a fascinating way for her to learn how government and industry can work together. For more information visit http://www.digitalpreservation.gov/multimedia/videos/hillary-mason.html&loclr=itb
Bringing Art and Technology Together - Inspire. Create. Evolve.
Hilary Mason lives in New York City, where she is the Chief Scientist at bitly. She is trying to popularize the field of Data Science. We also discuss her involvement with HackNY, NYCResistor, and her app to find the Median Hamburger in the West Village.
Mentioned in this podcast:
Read about Hilary's Burger App
Watch a video of Hilary's talk at Urban ReThink
Read updates from Hilary's team on the Bitly Blog
4th of July Recipes on Bitly
Inspiration:
Jer Thorp, Data Artist in Residence at the NYTimes
Jake Porway of DataKind
littleBits (on CNN)
MakerBot
Adafruit Industries
FamiLAB
The Lean Startup
Music: Soldiers of Speccy, Intermission by PILL
Follow us:
Hilary Mason @hmason
Bootstrapping Green @peregrineneel
Ryan Price @liberatr
As Artificial Intelligence evolves and the debate on Human Centered AI heats up, we should realize that Big Tech cannot solve community challenges, but should instead develop impact practices that add human value. Human Centered Design is critical as we look for ways of making AI work for us and raise societal standards in equal measure. What about ethical and explainable AI alongside responsible corporate data management? For instance, machine learning and analytics are becoming widely applicable, from reducing gender bias in Apple's case to facial recognition at the Amazon Go store. The big question here is: How can we improve data labeling, make accurate algorithms, and develop accountability measures for AI? Humans should be considered in the ethical AI debate by finding better ways to make technology reduce systemic inequality and develop common ground on AI. Listen in, as I discuss constructive use of AI in creating positive social outcomes and impact practices creating equitable outcomes in the United States.
In this episode: Jake Porway, Founder and Executive Director at DataKind
Today's episode is brought to you by Manning Publications. Receive 40% off all Data Science and AI Textbooks at: https://deals.manning.com/podhumain19/.
You can support the HumAIn podcast and receive subscriber-only content at http://humainpodcast.com/newsletter.