Podcasts about data provenance

  • 15 podcasts
  • 19 episodes
  • 37m average episode duration
  • 1 new episode per month
  • Latest episode: Nov 12, 2024

POPULARITY (2017–2024)


Latest podcast episodes about data provenance

Evaluating Biopharma
Episode: 23 - An Industry Veteran's Insight on Data Governance and Provenance

Nov 12, 2024 · 16:49


In this episode of Evaluating Biopharma, host Ben Locwin and Scott Endicott, executive leader of Healthcare Solutions Integration, talk about data governance and data provenance: why they matter and how the industry can achieve them. With 30 years of experience, Endicott has a deep understanding of training AI and ML models, particularly when patient data is included, and offers advice on how to navigate the challenges of data governance and provenance in the near future. Links from this episode: Evaluating Biopharma, Black Diamond Networks. About Evaluating Biopharma: Evaluating Biopharma taps into the insight and experience of biopharmaceutical leaders so today's decision makers can leverage their knowledge, learn from their successes, and avoid repeating similar mistakes. The series offers a new guest and moderator with each episode and aims to equip executives and science leaders with the information they need to make better business and process decisions. The Evaluating Biopharma podcast is a reproduction of content originally presented at recent Evaluating Biopharma digital and educational networking events.

Data Transforming Business
The Data Trail: Why Provenance Matters for Your Business

Sep 4, 2024 · 19:03


Data provenance is essential for maintaining trust and integrity in data management. It involves tracking the origin of data and understanding how it has been processed and handled over time. By focusing on fundamental principles such as identity, timestamps, and the content of the data, organisations can ensure that their data remains accurate, consistent, and reliable. Implementing data provenance does not require significant changes or large investments. Existing technologies and techniques can be seamlessly integrated to provide greater transparency and control over data. With data provenance, businesses can confidently manage their data, enhancing decision-making and fostering stakeholder trust. In this episode, Jon Geater, Co-Chair of the Supply Chain Integrity, Transparency and Trust (SCITT) Working Group, speaks to Paulina Rios Maya, Head of Industry Relations, about data provenance.

Key Takeaways:
• Data provenance is knowing where data comes from and how it has been handled, ensuring trust and integrity.
• The fundamental principles of data provenance include identity, timestamps, and the content of the data.
• Data provenance can be implemented by integrating existing technologies and techniques without significant changes or investments.
• Data provenance helps with compliance, such as GDPR, by providing a transparent record of data handling and demonstrating compliance with requests.

Chapters:
00:00 - Introduction and Background
02:01 - Understanding Data Provenance
05:47 - Implementing Data Provenance
10:01 - Data Provenance and Compliance
13:50 - Success Stories and Industry Applications
18:10 - Conclusion and Call to Action
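
The three principles this episode highlights (identity, timestamps, and the content of the data) can be sketched in a few lines of code. The following is a minimal sketch assuming a simple JSON record format; the field names and the SHA-256 chaining are illustrative choices for this listing, not anything specified in the episode.

```python
import hashlib
import json
from datetime import datetime, timezone
from typing import Optional

def content_hash(data: bytes) -> str:
    # Fingerprint the content so any later change to the data is detectable.
    return hashlib.sha256(data).hexdigest()

def provenance_record(actor: str, action: str, data: bytes, parent: Optional[str] = None) -> dict:
    # One provenance entry: who (identity) did what to which content, and when (timestamp).
    return {
        "actor": actor,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "content_sha256": content_hash(data),
        "parent": parent,  # hash of the previous record, linking the trail into a chain
    }

# Example trail: a collection step followed by a cleaning step that references it.
raw = b"customer_id,country\n42,DE\n"
first = provenance_record("ingest-service", "collected", raw)
cleaned = raw.replace(b"DE", b"Germany")
second = provenance_record(
    "etl-job-7", "cleaned", cleaned,
    parent=content_hash(json.dumps(first, sort_keys=True).encode()),
)
print(json.dumps([first, second], indent=2))
```

Chaining each record to a hash of the previous one lets a reader check that the handling history has not been quietly rewritten, which is the trust-and-integrity property the episode emphasises.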

Notes To My (Legal) Self
Season 6, Episode 20: Unveiling the Future of Data Integrity: Exploring the Data Provenance Standards (with Kassi Burns)

Dec 30, 2023 · 40:46


Kassi Burns is an attorney who has been using AI and Machine Learning for 10 years. Her legal career is defined by a continued curiosity about emerging technologies, technical and client-services skills, an understanding of data's impact on legal issues, and a desire to engage with others in and beyond the eDiscovery community. Over the years she has built a unique and diverse background in transactional law, eDiscovery, privacy law, data breach response, information law, and legal operations, with regular collaborations with Partner/GC/C-suite clients. Her curiosity about emerging technologies, such as blockchain, NFTs, and web3, and their impact on the legal profession, puts her on a continued path of discovery and educational growth. Her social media presence, including content through @eDiscoverist, is motivated by a desire to teach non-practitioners about eDiscovery and the legal community about web3. In this episode we explore the Data Provenance Standards by the Data & Trust Alliance! Learn how these standards, akin to food safety labels, promise greater transparency in data's journey from source to application. Discover how integrating these standards can address AI adoption challenges. Uncover the potential for these standards to revolutionize sectors from healthcare to finance, ensuring data integrity and fostering trust in AI systems.

Department of Statistics
(Not) Aggregating Data: The Corcoran Memorial Lecture

Feb 5, 2021 · 61:47


Professor Kerrie Mengersen, Distinguished Professor of Statistics at Queensland University of Technology in the Science and Engineering Faculty, gives The Corcoran Memorial Lecture, held on 21st January 2021. Abstract: The ability to generate, access and combine multiple sources of data presents both opportunity and challenge for statistical science. An exemplar phenomenon is the charge to collate all relevant data for the purposes of comprehensive control and analysis. However, this ambition is often thwarted by the relentless expansion in volume of data, as well as issues of data provenance, privacy and governance. Alternatives to creating 'the one database to rule them all' are emerging. An appealing approach is the concept of federated learning, also known as distributed analysis, which aims to analyse disparate datasets in situ. In this presentation, I will discuss some case studies that have motivated our interest in federated learning, review the statistical and computational issues involved in the development of such an approach, and outline our recent efforts to understand and implement a federated learning model in the context of the Australian Cancer Atlas.
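
The federated approach described in the abstract (analysing disparate datasets in situ and sharing only summaries) can be illustrated with a toy example. The sites, values, and the choice of a simple pooled mean below are illustrative assumptions, not the models used for the Australian Cancer Atlas.

```python
# Toy federated analysis: each site keeps its raw records and shares only
# sufficient statistics; a coordinator combines them into a global estimate.
sites = {
    "site_a": [2.1, 3.4, 2.8],
    "site_b": [4.0, 3.9],
    "site_c": [1.7, 2.2, 2.6, 3.1],
}

def local_summary(values):
    # Computed where the data lives; only (sum, count) ever leaves the site.
    return sum(values), len(values)

summaries = [local_summary(v) for v in sites.values()]
total = sum(s for s, _ in summaries)
count = sum(n for _, n in summaries)
print(f"Federated mean over {count} records: {total / count:.3f}")
```

The same pattern generalises by exchanging model updates or other sufficient statistics instead of raw rows, which is what lets provenance, privacy and governance constraints stay local to each site.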

PSA Today
PSA Today #17: Kaliya and Seth talk with Jim Fournier of Tru.net about data provenance, self-validating contracts and new models for surfacing social truths

Sep 9, 2020 · 48:43


Kaliya welcomes her long-time colleague and frequent collaborator Jim to talk about his history at the front line of identity management through today. From the Tru.net site: "In many domains, starting with the COVID-19 crisis, the world literally needs a new way of “thinking together”, a new way of collaboratively engaging in a deliberative process, where content can be traced to its source, information has real accountability, and we can choose what to disseminate across society-at-large. As the realization of the need for this burst into the mainstream, it became clear that a general network solution is needed by all organizations, not just for climate, and Tru was born."

Tcast
Data Provenance - Going back to Source

Aug 11, 2020 · 12:46


To assess the integrity and truth of data, and to allow a true lifetime analysis of a datum, you need its provenance: the historical lineage of the information. Data provenance is the tracing of data back through its past to its progenitor, the original owner. Going back to the source for old data, and knowing its creator as a fact going forward, makes data far simpler and more valuable to analyze. Data provenance is synonymous with Source Data. Tcast is an education, business, and technology video podcast that informs listeners and viewers about best practices, theory, and the technical functions of the TARTLE data marketplace system, and how it is designed to serve society with the highest and best intentions. Tcast is brought to you by TARTLE, a global personal data marketplace that allows users to sell their personal information anonymously when they want to, while allowing buyers to access clean, ready-to-analyze data sets on digital identities from across the globe. The show is hosted by Co-Founder and Source Data Pioneer Alexander McCaig and Head of Conscious Marketing Jason Rigby. What's your data worth? Find out at https://tartle.co/. Watch the podcast on YouTube (https://www.youtube.com/channel/UC46qT-wHaRzUZBDTc9uBwJg), like our Facebook Page (https://www.facebook.com/TARTLEofficial/), follow us on Instagram (https://www.instagram.com/tartle_official/), follow us on Twitter (https://twitter.com/TARTLEofficial), and spread the word!

Distributed Data Show
COVID-19 and Data Provenance with Mike Loukides | Ep. 141 Distributed Data Show

Mar 24, 2020 · 20:47


One of the hardest areas in getting AI projects into production is operationalizing data. In this episode, Mike Loukides of O'Reilly Media joins Dr. Denise Gosnell and Jeff Carpenter to discuss how data provenance impacts our ability to get the most out of our data, using COVID-19 as an example. See omnystudio.com/listener for privacy information.

Decentralize This!
Ep 15 - Don Gossen - Data Provenance and Building Bridges to Decentralization

Jan 8, 2019 · 50:23


Hosted by Enigma's Head of Growth Tor Bair, our fifteenth episode features Don Gossen. Don is the co-founder of Ocean Protocol, a decentralized data exchange protocol and one of Enigma's launch partners. Don has spent his entire career focused on data, analytics, and business intelligence, working in many different countries around the world. He is an expert in how organizations use and consume data, and he has seen first-hand the challenges they face in doing so. On this episode Don talks with Tor about data provenance and privacy, how to bridge the established centralized world and emerging decentralized systems, how organizations can embrace blockchain-based data solutions, and the critical importance of making the decentralization space more collaborative and less tribal. Enigma's new podcast "Decentralize This!" features guests from all over the decentralization space: developers, investors, entrepreneurs, researchers, writers, artists, people in government and enterprise - all individuals who care deeply about building a more decentralized and sustainable world. How can all these people with different perspectives collaborate to create and scale the technologies we need to shape a better future? Relevant links: Ocean Protocol: www.oceanprotocol.com | Enigma: www.enigma.co | Enigma Blog: blog.enigma.co | Enigma Twitter: www.twitter.com/enigmampc

Lay of the Land
Receiving a Package in 2025 - Apurva Chiranewala

Nov 20, 2018 · 22:36


Sendle, Australia's highest-rated and fastest-growing courier company, is changing the delivery landscape. Apurva Chiranewala, Sendle's Head of Strategy and Partnerships, has a vision for the future of deliveries, and he shares what role he sees proof of location playing in that future. His answers are fascinating. You'll be amazed to learn where shipping is headed. Spoiler alert: it's going to look completely different! www.sendle.com | https://twitter.com/Apurv_Ed | Support the show (https://www.layoftheland.space)

Lay of the Land
Part 2 - PLATIN Interview with Dr. Lionel Wolberger and Allon Mason

Nov 6, 2018 · 36:36


"Proof of location is a market that really is ripe for disruption and something is about to change dramatically in the way location influences our lives and the way we control our own location data." - Allon Mason. Platin founders Dr. Lionel Wolberger and Allon Mason walk us through their proof of location protocol, Platin. Website: www.platin.io (Telegram: https://platin.io/t, Twitter: https://platin.io/w, Medium: https://platin.io/m). In this jam-packed episode we discuss signal harvesting, sensor science, key management, security, usability and more. We also discuss one of my favourite areas of cryptography: zero knowledge proofs. Dr. Wolberger explains the general concept of zero knowledge proofs in a way that makes them easier to understand than ever before. It is a long interview, so I've broken it up into two parts. However, I released both episodes at the same time, so you can binge-listen or pause for reflection and processing between the episodes. Please subscribe, rate and review the podcast. It helps other listeners find it. Thank you for being here. I hope you enjoy listening to Lay of the Land. Support the show (https://www.layoftheland.space)

Lay of the Land
Part 1 - PLATIN Interview with Dr. Lionel Wolberger and Allon Mason

Nov 5, 2018 · 20:50


"Proof of location is a market that really is ripe for disruption and something is about to change dramatically in the way location influences our lives and the way we control our own location data." - Allon Mason. Platin founders Dr. Lionel Wolberger and Allon Mason walk us through their proof of location protocol, Platin. Website: www.platin.io (Telegram: https://platin.io/t, Twitter: https://platin.io/w, Medium: https://platin.io/m). In this jam-packed episode we discuss signal harvesting, sensor science, key management, security, usability and more. We also discuss one of my favourite areas of cryptography: zero knowledge proofs. Dr. Wolberger explains the general concept of zero knowledge proofs in a way that makes them easier to understand than ever before. It is a long interview, so I've broken it up into two parts. However, I released both episodes at the same time, so you can binge-listen or pause for reflection and processing between the episodes. Please subscribe, rate and review the podcast. It helps other listeners find it. Thank you for being here. I hope you enjoy listening to Lay of the Land. Support the show (https://www.layoftheland.space)

Lay of the Land
Introduction to Lay of the Land

Oct 31, 2018 · 13:19


Proof of Location is a new field at the bleeding edge of technology and is an important part of the Internet of Things, gaming, supply chains, insurance and many other fields. Your host, Kiersten Jowett, is a Proof of Location Specialist in blockchain technology, but this podcast isn't just about blockchain technology. Lay of the Land introduces you to specialists in all areas related to proof of location. What is proof of location? Proof of location is like a mirror. Our digital world is mirroring our physical world, and what proof of location is trying to do is build a better, more accurate mirror in the digital world to better represent where things are in the real world. It's an exciting and complex field involving computer science, social science, law, finance, cryptography, mechanism design and software engineering. Join me for each mind-blowing episode of Lay of the Land!! Support the show (https://www.layoftheland.space)

Decrypt Asia - Blockchain and Cryptocurrency Podcast
Ep. #18: Data Provenance and Authenticity using Blockchain with Mike Davie (Founder Quadrant Protocol)

Jul 15, 2018 · 26:27


Mike Davie is the Founder of Quadrant Protocol, a blockchain project that allows for easy access, creation and distribution of data. Mike was a member of the Advanced Mobile Product Strategy Division at Samsung, where he developed go-to-market strategies for cutting-edge technologies created in the Samsung R&D Labs. Thereafter, he founded DataStreamX, an online marketplace for real-time data that allows companies that produce data to easily monetize and distribute it, and allows data consumers to source their data requirements in one place. With the emergence of blockchain technology, Mike sought to further improve what he was trying to do at DataStreamX, and as a result Quadrant Protocol was born. Tune in to hear about:
• Mike's background and how Quadrant Protocol was born
• The problems Quadrant Protocol is seeking to address
• How Quadrant Protocol is a data stamping protocol and the value that brings
• The different stakeholders in the Quadrant ecosystem and their roles ('nurseries' are where the data is born, 'pioneers' turn the data into products, and 'elons' are innovative companies and individuals who try to create more value from the data products)
• How the different stakeholders are incentivised in the ecosystem
• Quadrant Protocol's consensus mechanism (a mix of Proof of Authority and Proof of Work)
• Quadrant Protocol's roadmap
• The key things Quadrant is looking for: partnerships, data providers and data consumers
• A major partnership with IMDA (Government of Singapore): https://medium.com/quadrantprotocol/singapores-info-communications-media-development-authority-imda-partners-with-datastreamx-and-be4b1ba94cd9
• Quadrant Protocol website and Medium

Linear Digressions
Data Provenance

Sep 3, 2017 · 22:48


Software engineers are familiar with the idea of versioning code, so you can go back later and revive a past state of the system.  For data scientists who might want to reconstruct past models, though, it's not just about keeping the modeling code.  It's also about saving a version of the data that made the model.  There are a lot of other benefits to keeping track of datasets, so in this episode we'll talk about data lineage or data provenance.
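
A minimal sketch of the idea, assuming a simple file-based workflow: fingerprint the exact dataset used for training and store that fingerprint next to the usual code version, so a past model can be traced back to both. The file names, metadata layout, and helper names are hypothetical, not from the episode.

```python
import hashlib
import json
from pathlib import Path

def dataset_version(path: str) -> str:
    # Content-address the dataset: identical bytes always yield the same version id.
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()[:12]

def save_model_metadata(model_name: str, data_path: str, code_commit: str) -> dict:
    # Record what is needed to reconstruct the model: the code version AND the data version.
    meta = {
        "model": model_name,
        "code_commit": code_commit,                  # the familiar software versioning
        "data_path": data_path,
        "data_version": dataset_version(data_path),  # the extra piece this episode argues for
    }
    Path(f"{model_name}.meta.json").write_text(json.dumps(meta, indent=2))
    return meta

# Hypothetical usage with a tiny placeholder dataset so the example runs end to end.
Path("train.csv").write_text("age,churned\n34,0\n51,1\n")
print(save_model_metadata("churn_model_v3", "train.csv", code_commit="a1b2c3d"))
```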

Data Skeptic
Data Provenance and Reproducibility with Pachyderm

Feb 3, 2017 · 40:11


Versioning isn't just for source code. Being able to track changes to data is critical for answering questions about data provenance, quality, and reproducibility. Daniel Whitenack joins me this week to talk about these concepts and share his work on Pachyderm. Pachyderm is an open source containerized data lake. During the show, Daniel mentioned the Gopher Data Science github repo as a great resource for any data scientists interested in the Go language. Although we didn't mention it, Daniel also did an interesting analysis on the 2016 world chess championship that complements our recent episode on chess well. You can find that post here. Supplemental music is Lee Rosevere's Let's Start at the Beginning. Thanks to Periscope Data for sponsoring this episode. More about them at periscopedata.com/skeptics.

Data Skeptic
[MINI] Data Provenance

Jan 8, 2015 · 10:56


This episode introduces a high-level discussion on the topic of data provenance, with more MINI episodes to follow on specific topics. Thanks to listener Sara L, who wrote in to point out that the Data Skeptic Podcast has focused a lot on using data to be skeptical, but not necessarily on being skeptical of data. Data provenance is the concept of knowing the full origin of your dataset. Where did it come from? Who collected it? How was it collected? Does it combine independent sources or one singular source? What are the error bounds on the way it was measured? These are just some of the questions one should ask to understand their data. After all, if the antecedent of an argument is built on dubious grounds, the consequent of the argument is equally dubious. For a more technical discussion than what we get into in this mini episode, I recommend A Survey of Data Provenance Techniques by Simmhan, Plale, and Gannon.
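
The questions listed here translate naturally into a small metadata schema that can travel with a dataset. The field names and example values below are illustrative assumptions, not a standard discussed in the episode.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetProvenance:
    # One field per question the episode suggests asking about a dataset.
    origin: str                          # Where did it come from?
    collected_by: str                    # Who collected it?
    collection_method: str               # How was it collected?
    sources: list = field(default_factory=list)  # Independent sources, or one singular source?
    measurement_error: str = "unknown"   # What are the error bounds on the way it was measured?

example = DatasetProvenance(
    origin="2014 customer satisfaction survey",
    collected_by="third-party polling firm",
    collection_method="opt-in web questionnaire",
    sources=["web_responses", "phone_followups"],
    measurement_error="self-reported; selection bias likely",
)
print(example)
```

A missing or vague field in a record like this is itself useful information: it signals how skeptical of the data an analysis should be.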
