In this episode of Identity at the Center, hosts Jeff Steadman and Jim McDonald are joined by Jerome Thorstenson, IAM Architect with Salling Group, live from EIC 2025 in Berlin! Jerome shares his insights on B2B identity, the challenges of managing access for a complex supply chain, and the importance of an identity-first approach.

Discover how Salling Group, operating major labels like Target and Starbucks, handles identity for thousands of employees and external partners. Jerome dives into the complexities of balancing security, user experience, and the practicalities of implementing IGA and ABAC. From navigating the challenges of data quality and high employee turnover to the nuances of transitioning between IGA systems, this episode offers valuable insights for identity practitioners.

Chapter Timestamps:
00:00:00 - B2B Identity Challenges
00:02:14 - Welcome to Identity at the Center from EIC 2025
00:04:14 - Jerome's Journey into Identity
00:05:19 - Salling Group Overview
00:06:57 - Securing B2B - Jerome's Presentation
00:10:54 - Controlling Access in B2B
00:11:41 - Identity as a Product
00:14:51 - The Role of the IAM Practitioner
00:16:31 - ABAC as a Game Changer
00:21:00 - Language Considerations in a European Context
00:22:33 - Employee Turnover Challenges
00:25:07 - IGA Implementation Insights
00:29:28 - Identity Fabric Discussion
00:31:21 - Jerome's Caribbean Background
00:34:06 - Wrap-up and Contact Information

Connect with Jerome: https://www.linkedin.com/in/jetdk/

Connect with us on LinkedIn:
Jim McDonald: https://www.linkedin.com/in/jimmcdonaldpmp/
Jeff Steadman: https://www.linkedin.com/in/jeffsteadman/

Visit the show on the web at http://idacpodcast.com

Keywords: IDAC, Identity at the Center, Jeff Steadman, Jim McDonald, EIC 2025, B2B Identity, Identity First Security, IAM, Identity and Access Management, Supply Chain Security, IGA, ABAC, Attribute-Based Access Control, Role-Based Access Control, Identity Fabric, Digital Identity, Cybersecurity, Data Quality, Employee Turnover, Caribbean
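The episode contrasts ABAC with role-based access control. The core idea of ABAC, deciding access by evaluating attributes of the user, the resource, and the action at request time instead of pre-assigned roles, can be sketched in a few lines. All attribute names and the supplier policy below are illustrative assumptions, not anything from Salling Group's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_attrs: dict
    resource_attrs: dict
    action: str

# Each ABAC policy is a predicate over the request's attributes.
policies = [
    # Hypothetical rule: a supply-chain partner may read shipment data
    # only when the shipment belongs to their own supplier ID.
    lambda r: (r.action == "read"
               and r.resource_attrs.get("type") == "shipment"
               and r.user_attrs.get("supplier_id") == r.resource_attrs.get("supplier_id")),
]

def is_allowed(request: Request) -> bool:
    # Access is granted if any policy matches; no role assignment is needed.
    return any(policy(request) for policy in policies)

req = Request(
    user_attrs={"role": "partner", "supplier_id": "S-42"},
    resource_attrs={"type": "shipment", "supplier_id": "S-42"},
    action="read",
)
print(is_allowed(req))  # True: the attributes line up
```

Compared with RBAC, nothing had to be provisioned for this partner in advance, which is one reason ABAC is attractive for high-turnover B2B populations like the ones discussed in the episode.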
Data is the new gold, but only if you can use it wisely. In this insightful interview, Tony from COVER chats with Angus Black of BarnOwl Data Solutions to unpack the explosion of data in the insurance sector—its growing volumes, evolving quality demands, and the pressure on insurers to extract real value. Angus shares practical insights into working with legacy systems, defining and measuring data quality, and how regulatory pressure is reshaping priorities. If you're in insurance, this is a must-listen for understanding how data is becoming your most important asset.
Episode Summary
This episode dives into the critical role of data quality in unlocking the true potential of AI for ABM success. Daryn Smith shares how organizations can move from AI hype to AI readiness, focusing on bridging data silos and using AI effectively. The discussion explores practical steps to improving data management, leveraging AI tools to streamline processes, and why leadership plays a pivotal role in driving change. Daryn also reveals fascinating use cases where AI has made a tangible difference in business operations and marketing strategies.

Key Takeaways
AI Readiness is Lagging: Only 8.5% of companies are truly AI-ready, despite executives highly prioritizing AI adoption.
Data Quality is Key: 70% of respondents prioritize data quality over AI, revealing that unstructured or siloed data hampers AI effectiveness.
Leadership's Role in AI Success: Leaders must champion AI adoption while being transparent about the need for continuous improvements in data management and AI training.
Blind Automation Can Undermine Results: AI often requires contextual data and meaningful human oversight to produce relevant, impactful recommendations.
Practical AI Applications: From creating AI-driven RFP agents to developing AI tools that emulate senior employees' knowledge, simple yet impactful use cases can revolutionize workflows.

Best Moments
(01:10) – Daryn's Career Journey: From web developer to CEO, Daryn details how his unique background has shaped his approach to marketing and AI-led transformation.
(04:53) – The Growing Importance of AI Readiness: Daryn highlights how data silos and poor data quality hinder AI's potential and shares insights from Hubble Digital's recent research.
(10:57) – AI's Role in Revolutionizing ABM: Daryn and Paul discuss how AI-powered tools are automating manual tasks, leading to better account targeting and personalized campaigns.
(14:00) – The Disconnect Between Leadership and Reality: A candid discussion on why executives often overestimate their organizations' AI capabilities.
(34:06) – AI in Practical Use Cases: Daryn shares how Hubble Digital uses AI agents to streamline RFP responses and retain institutional knowledge.
(22:27) – Evolving Data Systems: Tips on keeping CRM and ABM systems agile to adapt to business changes, ensuring they retain their value long term.

Tech Recommendations
HubSpot – A platform supporting ABM strategies and AI integration for CRM efficiency.
Fathom.ai – A tool for turning unstructured sales and marketing data into actionable insights.

Books:
Hacking Marketing by Scott Brinker - Agile Practices to Make Marketing Smarter, Faster, and More Innovative

Dharmesh Shah - Founder & CTO, HubSpot
Dave Gerhardt - Founder, Exit Five
In this episode of the Transform Sales Podcast: Sales Software Review Series, Dave Menjura ☁, Marketplace Specialist at CloudTask, interviews Saahil Dhaka, Co-Founder and CEO at Clientell, a customer engagement platform designed to help businesses manage client relationships and enhance communication. The platform targets sales and customer service teams looking to improve client satisfaction and retention through personalized outreach. Saahil shares how Clientell simplifies client communication by automating key processes, reducing admin work by up to 70%, and improving data hygiene. By streamlining workflows and enhancing visibility, Clientell allows teams to focus on more strategic tasks while improving overall operational efficiency. The platform's ability to centralize data and reduce the complexity of tech stacks makes it a powerful solution for businesses seeking to optimize customer engagement and reduce costs. Ideal for sales teams and customer service leaders struggling with manual processes and inefficient systems, Clientell transforms how companies interact with clients, providing a seamless experience that drives retention and growth. Try Clientell here: https://getcloudtask.com/clientell-55dc14 #TransformSales #SalesSoftware #Clientell #CloudTask
Were you in Washington, DC, last week for IIeX? If not, our latest episode of Intellicast has you covered! Brian Peterson and Matthew Alexander were in the nation's capital for this year's event and sat down to recap all the trends and topics. Let's dive into what the guys discussed:

Data Quality Takes Center Stage
One of the most talked-about issues this year was data quality, spurred on by the recent industry indictment. Sessions addressing fraud detection, transparent sampling practices, and respondent experience were standing room only. It's clear that researchers are demanding more transparency and layered approaches to improving data quality, from better survey design to better tools and even an improved respondent experience.

AI and Synthetic Data: More Practical Applications and Progress
AI continued to dominate the conversation, but the tone has shifted. This year, discussions focused on practical applications of how AI can assist in the research process, with a variety of presenters showing how they had integrated AI into their tools. Synthetic data was also a large topic of conversation, with several presenters showing their latest strides in synthetic data and how they are validating it. Brian and Matthew both agreed that the applications are gaining traction in qualitative research but, based on the presentations, were still cautious about its readiness for quantitative applications.

What Worked (and What Didn't) From a Format Perspective
Both Brian and Matthew felt Washington, D.C. was a fantastic host city—clean, easy to navigate, and close to the airport. They were also happy that the Greenbook team abandoned the headphone concept that was used in Austin. The only drawback they found was that some sessions suffered from mismatched room sizes, with popular topics overflowing beyond capacity. They speculated that because of the indictment news that broke just a couple of weeks prior, there was increased interest in some topics.
Give it a listen and let us know what you think. And hey, if IIeX comes to Cincinnati next year like Brian and Matthew suggested, we'll see you there! Did you miss one of our webinars or want to get some of our whitepapers and reports? You can find it all on our Resources page on our website here. Learn more about your ad choices. Visit megaphone.fm/adchoices
A modern digital healthcare economy is impossible without a robust provider directory, which serves as the foundation for interoperability and crucial processes. In this episode, Erin Weber, Chief Policy and Research Officer, discusses how CAQH supports provider directories, emphasizing the need for data accuracy and standardization through initiatives like universal group roster templates. She highlights the importance of interoperability and maintaining accurate data to ensure seamless care delivery and billing. Don Rucker, Chief Strategy Officer, talks about modern FHIR APIs and interoperability. He uses the analogy of domain name services on the internet and stresses the need for “computable interoperability” where data can be used in real time to improve care. They explain how the 21st Century Cures Act has impacted healthcare and how legacy systems need to be modernized. Don and Erin stress that this work is crucial for modern healthcare to evolve and deliver improved patient experiences. Tune in and learn how these key changes are shaping the future of healthcare! Resources: Connect with and follow Erin Weber on LinkedIn. Follow CAQH on LinkedIn and visit their website. Connect with and follow Don Rucker on LinkedIn. Learn more about 1upHealth on their LinkedIn and website. Check out the latest annual CAQH Index Report here.
What if AI could be the key to solving the concrete industry's biggest problem—labor shortages? In this episode of the Concrete Logic Podcast, Seth Tandett sits down with Ramy Sedra, CEO of C60, to discuss how AI is transforming construction. From boosting productivity to making smarter business decisions, they break down how AI can help, but also why it's not a one-size-fits-all solution. Tune in to find out how data, technology, and strategy come together to reshape the future of concrete. Don't miss this essential conversation for anyone looking to stay ahead in the industry! What You'll Discover in This Episode:
Who said data was just an invisible string of numbers? On this episode of Bringing Data and AI to Life our host and GVP of Solution Specialist Sales at Informatica, Amy Horowitz, is joined by Cathy Hackl, CEO and Founder of Future Dynamics. Cathy is a globally recognized futurist, an expert in deep tech and a speaker on AI. This conversation touches on the definition of data itself, and how it goes beyond what you can calculate in a spreadsheet. Cathy talks us through the challenges businesses face when incorporating AI into their practices, as well as the rise of using physical AI to visualise data, reminding us that the quality of your data is critical to the rate of your growth.
“AI is only as powerful as the data behind it. If you don't trust the inputs, you can't trust the outputs and that's where most companies get stuck. It's not enough to have automation or algorithms; you need quality, transparency, and alignment across your go-to-market motion. That's the difference between tech that looks smart and tech that actually drives revenue.” AI is everywhere but without clean data and strategic alignment, it's just noise. In this episode of Revenue Boost: A Marketing Podcast, titled, Smarter Tech, Sharper Targeting: Fueling Revenue with AI, Data Quality, and GTM Alignment, Demandbase CMO Kelly Hopping joins host Kerry Curran to unpack what it really takes to make AI work for B2B revenue growth. From smarter targeting to scaling with efficiency, Kelly shares how enterprise leaders can leverage AI-powered tools only when grounded in high-quality data and a clearly defined ICP. You'll learn why GTM alignment matters more than ever and how to avoid the pitfalls of disconnected tech stacks and generic automation. If you're building or optimizing your go-to-market engine, this episode is your roadmap to doing it smarter.
Randall Balestriero joins the show to discuss some counterintuitive findings in AI. He shares research showing that huge language models, even when started from scratch (randomly initialized) without massive pre-training, can learn specific tasks like sentiment analysis surprisingly well, train stably, and avoid severe overfitting, sometimes matching the performance of costly pre-trained models. This raises questions about when giant pre-training efforts are truly worth it.

He also talks about how self-supervised learning (where models learn from data structure itself) and traditional supervised learning (using labeled data) are fundamentally similar, allowing researchers to apply decades of supervised learning theory to improve newer self-supervised methods.

Finally, Randall touches on fairness in AI models used for Earth data (like climate prediction), revealing that these models can be biased, performing poorly in specific locations like islands or coastlines even if they seem accurate overall, which has important implications for policy decisions based on this data.

SPONSOR MESSAGES:
***
Tufa AI Labs is a brand new research lab in Zurich started by Benjamin Crouzier focussed on o-series style reasoning and AGI. They are hiring a Chief Engineer and ML engineers. Events in Zurich.
Goto https://tufalabs.ai/
***

TRANSCRIPT + SHOWNOTES:
https://www.dropbox.com/scl/fi/n7yev71nsjso71jyjz1fy/RANDALLNEURIPS.pdf?rlkey=0dn4injp1sc4ts8njwf3wfmxv&dl=0

TOC:
1. Model Training Efficiency and Scale
[00:00:00] 1.1 Training Stability of Large Models on Small Datasets
[00:04:09] 1.2 Pre-training vs Random Initialization Performance Comparison
[00:07:58] 1.3 Task-Specific Models vs General LLMs Efficiency

2. Learning Paradigms and Data Distribution
[00:10:35] 2.1 Fair Language Model Paradox and Token Frequency Issues
[00:12:02] 2.2 Pre-training vs Single-task Learning Spectrum
[00:16:04] 2.3 Theoretical Equivalence of Supervised and Self-supervised Learning
[00:19:40] 2.4 Self-Supervised Learning and Supervised Learning Relationships
[00:21:25] 2.5 SSL Objectives and Heavy-tailed Data Distribution Challenges

3. Geographic Representation in ML Systems
[00:25:20] 3.1 Geographic Bias in Earth Data Models and Neural Representations
[00:28:10] 3.2 Mathematical Limitations and Model Improvements
[00:30:24] 3.3 Data Quality and Geographic Bias in ML Datasets

REFS:
[00:01:40] Research on training large language models from scratch on small datasets, Randall Balestriero et al.
https://openreview.net/forum?id=wYGBWOjq1Q
[00:10:35] The Fair Language Model Paradox (2024), Andrea Pinto, Tomer Galanti, Randall Balestriero
https://arxiv.org/abs/2410.11985
[00:12:20] Muppet: Massive Multi-task Representations with Pre-Finetuning (2021), Armen Aghajanyan et al.
https://arxiv.org/abs/2101.11038
[00:14:30] Dissociating language and thought in large language models (2023), Kyle Mahowald et al.
https://arxiv.org/abs/2301.06627
[00:16:05] The Birth of Self-Supervised Learning: A Supervised Theory, Randall Balestriero et al.
https://openreview.net/forum?id=NhYAjAAdQT
[00:21:25] VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning, Adrien Bardes, Jean Ponce, Yann LeCun
https://arxiv.org/abs/2105.04906
[00:25:20] No Location Left Behind: Measuring and Improving the Fairness of Implicit Representations for Earth Data (2025), Daniel Cai, Randall Balestriero, et al.
https://arxiv.org/abs/2502.06831
[00:33:45] Mark Ibrahim et al.'s work on geographic bias in computer vision datasets, Mark Ibrahim
https://arxiv.org/pdf/2304.12210
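The coastal and island bias Randall describes is easy to miss when only a single aggregate metric is reported: a model can look accurate overall while failing entirely in one region. A toy illustration of disaggregating accuracy by location group (the data is synthetic and purely for illustration, not from the paper):

```python
from collections import defaultdict

# (location group, prediction, ground truth) - synthetic records where the
# model is perfect inland but always wrong on islands.
records = [
    ("inland", 1, 1), ("inland", 0, 0), ("inland", 1, 1), ("inland", 0, 0),
    ("inland", 1, 1), ("inland", 0, 0), ("inland", 1, 1), ("inland", 0, 0),
    ("island", 1, 0), ("island", 0, 1),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, pred, truth in records:
    correct[group] += int(pred == truth)
    total[group] += 1

overall = sum(correct.values()) / sum(total.values())
print(f"overall accuracy: {overall:.2f}")  # 0.80, which looks acceptable
for group in total:
    # Per-group accuracy exposes the failure the aggregate hides.
    print(f"{group}: {correct[group] / total[group]:.2f}")
```

Here the overall accuracy of 0.80 masks a 0.00 accuracy on islands, which is exactly the kind of gap that matters for policy decisions built on Earth data models.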
This week on Alter Everything, we chat with Scott Jones and Treyson Marks from DCG Analytics about the history and misconceptions of AI, the importance of data quality, and how Alteryx can serve as a powerful tool for pre-processing AI data. Topics of this episode include the role of humans in auditing AI outputs and the critical need for curated data to ensure trustworthy results. Through real-world use cases, this episode explores how AI can significantly enhance analytics and decision-making processes in various industries.

Panelists:
Treyson Marks, Managing Partner @ DCG Analytics - LinkedIn
Scott Jones, Principal Analytics Consultant @ DCG Analytics - LinkedIn
Megan Bowers, Sr. Content Manager @ Alteryx - @MeganBowers, LinkedIn

Show notes: DCG Analytics

Interested in sharing your feedback with the Alter Everything team? Take our feedback survey here!

This episode was produced by Megan Bowers, Mike Cusic, and Matt Rotundo. Special thanks to Andy Uttley for the theme music and Mike Cusic for our album artwork.
In this episode of Intellicast, host Brian Peterson is joined by Mary Draper and Aron Wilson for a timely and candid discussion about the state of data quality in market research. The team unpacks the recent indictment involving OP4G and Slice MR, accused of orchestrating a $10 million fraud scheme using fabricated survey data. They explore how this scandal highlights long-standing challenges around fraud detection, insider manipulation, and the limitations of current quality safeguards. The conversation also recaps key insights from the recent Insights Association Ignite Data Quality Conference, including the growing push for sample transparency, the nuanced role of ISO certifications, and the critical need for a respondent-first approach to survey design. With perspectives from both supplier and full-service backgrounds, this episode explores how the industry can work together to raise the bar for quality, combat fraud, and ultimately rebuild trust. Whether you're a panel provider, research buyer, or full-service agency, this episode is packed with insights you don't want to miss. Want a better understanding of EMI data quality processes? Click Here Heading to Washington D.C. for IIeX – be sure to connect with Matthew and Brian! Did you miss one of our webinars or want to get some of our whitepapers and reports? You can find it all on our Resources page on our website here. Learn more about your ad choices. Visit megaphone.fm/adchoices
Is a Data Lakehouse the key to unlocking AI success? I had a great discussion on how Data Lakehouses are shaping the future of AI with Edward Calvesbert, VP, Product Management - watsonx, IBM, on The Ravit Show!We explored the evolving relationship between data quality and AI success, the biggest misconceptions companies have about using their data for AI, and how data architects and engineers need to adapt.We also tackled the challenge of data silos, the explosion of unstructured data, and the crucial balance between security, governance, and scaling AI across the enterprise.AI is only as good as the data behind it—so how do we ensure businesses are using the right data?#watsonx #sponsored #ibm #theravitshow
Prashant Jajodia, Financial Services Sector Leader & Managing Partner, IBM ConsultingIBM reports that just 8% of banks were systematically developing generative AI in 2024 but that this percentage is set to soar in 2025 as businesses move from pilot to execution in pursuit of profits. Artificial intelligence is set to become the core driver of innovation, transforming AI from a tool into the foundation of a bank's business model. Robin Amlôt of IBS Intelligence speaks to Prashant Jajodia, Financial Services Sector Leader and Managing Partner of IBM Consulting.
How can businesses scale data streaming efficiently? I spoke with Mary Elizabeth Cutrali, Head of Product at Buf on The Ravit Show, about how organizations can optimize cloud costs while maintaining high data quality.We discussed what Bufstream is, how it helps manage total cost of ownership, and what makes it different from other Kafka providers. Mary also shared some exciting updates on what's next for Bufstream!If cloud cost optimization and data quality are on your radar, check out the full conversation!#data #ai #streaming #gartnerorlando2025 #theravitshow
"So you want trusted data, but you want it now? Building this trust really starts with transparency and collaboration. It's not just technology. It's about creating a single governed view of data that is consistent no matter who accesses it, " says Errol Rodericks, Director of Product Marketing at Denodo.In this episode of the 'Don't Panic, It's Just Data' podcast, Shawn Rogers, CEO at BARC US, speaks with Errol Rodericks from Denodo. They explore the crucial link between trusted data and successful AI initiatives. They discuss key factors such as data orchestration, governance, and cost management within complex cloud environments. We've all heard the horror stories – AI projects that fail spectacularly, delivering biased or inaccurate results. But what's the root cause of these failures? More often than not, it's a lack of focus on the data itself. Rodericks emphasises that "AI is only as good as the data it's trained on." This episode explores how organisations can avoid the "garbage in, garbage out" scenario by prioritising data quality, lineage, and responsible AI practices. Learn how to avoid AI failures and discover strategies for building an AI-ready data foundation that ensures trusted, reliable outcomes. 
Key topics include overcoming data bias, ETL processes, and improving data sharing practices.

Takeaways
Bad data leads to bad AI outputs.
Trust in data is essential for effective AI.
Organisations must prioritise data quality and orchestration.
Transparency and collaboration are key to building trust in data.
Compliance is a responsibility for the entire organisation, not just IT.
Agility in accessing data is crucial for AI success.

Chapters
00:00 The Importance of Data Quality in AI
02:57 Building Trust in Data Ecosystems
06:11 Navigating Complex Data Landscapes
09:11 Top-Down Pressure for AI Strategy
11:49 Responsible AI and Data Governance
15:08 Challenges in Personalisation and Compliance
17:47 The Role of Speed in Data Utilisation
20:47 Advice for CFOs on AI Investments

About Denodo
Denodo is a leader in data management. The award-winning Denodo Platform is the leading logical data management platform for transforming data into trustworthy insights and outcomes for all data-related initiatives across the enterprise, including AI and self-service. Denodo's customers in all industries all over the world have delivered trusted AI-ready and business-ready data in a third of the time and with 10x better performance than with lakehouses and other mainstream data platforms alone.
Are companies starting to get concerned that AI isn't meeting expectations? Concern is a part of any new technology adoption curve, but let's explore some areas where expectations might not be meeting results.

SHOW: 914
SHOW TRANSCRIPT: The Cloudcast #914 Transcript
SHOW VIDEO: https://youtube.com/@TheCloudcastNET
CLOUD NEWS OF THE WEEK: http://bit.ly/cloudcast-cnotw
CHECK OUT OUR NEW PODCAST: "CLOUDCAST BASICS"

SHOW SPONSORS:
Try Postman AI Agent Builder Today
Cut Enterprise IT Support Costs by 30-50% with US Cloud

SHOW NOTES:
Why AI isn't meeting expectations (The Artificial Intelligence Enterprise)

IS THREE YEARS ENOUGH TIME FOR ANY TECHNOLOGY TO TAKE OVER THE WORLD?
Costs are still high, and positive ROI is still evolving
The technology stack and standards are still evolving
Enterprise expectations are being confused with consumer expectations
AI predictions and timelines are overly aggressive
How is anyone measuring AI success?
The AI future is already here, it's just unevenly distributed
"AI First" strategies are following "Cloud First" strategies - unevenly and distributed
Most people don't like to talk about the augment vs. replace issue

Miscellaneous stuff:
ChatGPT is the fastest-growing tech ever
AGI will be here at any moment
For the first 18+ months, it was only an OpenAI + NVIDIA market
All software development will be done by AI
There will be $1B companies with 1 person
GenAI, Frontier Models, Open Source Models, Agents, etc.
Deep Seek, MCP, etc.

FEEDBACK?
Email: show at the cloudcast dot net
Twitter/X: @cloudcastpod
BlueSky: @cloudcastpod.bsky.social
Instagram: @cloudcastpod
TikTok: @cloudcastpod
The GeekNarrator memberships can be joined here: https://www.youtube.com/channel/UC_mGuY4g0mggeUGM6V1osdA/join
Membership will get you access to member-only videos, exclusive notes, and monthly 1:1 with me. Here you can see all the member-only videos: https://www.youtube.com/playlist?list=UUMO_mGuY4g0mggeUGM6V1osdA

About this episode:
In this conversation, Jacopo and Ciro discuss their journey in building Bauplan, a platform designed to simplify data management and enhance developer experience. They explore the challenges faced in data bottlenecks, the integration of development and production environments, and the unique approach of Bauplan using serverless functions and Git-like versioning for data. The discussion also touches on scalability, handling large data workloads, and the critical aspects of reproducibility and compliance in data management.
Chapters:
00:00 Introduction
03:00 The Data Bottleneck: Challenges in Data Management
06:14 Bridging Development and Production: The Need for Integration
09:06 Serverless Functions and Git for Data
17:03 Developer Experience: Reducing Complexity in Data Management
19:45 The Role of Functions in Data Pipelines: A New Paradigm
23:40 Building Robust Data Solutions: Versioning and Parameters
30:13 Optimizing Data Processing: Bauplan Runtime
46:46 Understanding Control Planes and Data Management
48:51 Ensuring Robustness in Data Pipelines
52:38 Data Quality and Testing Mechanisms
54:43 Branching and Collaboration in Data Development
57:09 Scalability and Resource Management in Data Functions
01:01:13 Handling Large Data Workloads and Use Cases
01:09:05 Reproducibility and Compliance in Data Management
01:16:46 Future Directions in Data Engineering and Use Cases

Links and References:
Bauplan website: https://www.bauplanlabs.com
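The episode does not go into Bauplan's internals, but the "Git for data" workflow it describes, branching a dataset, experimenting in isolation, then merging back once checks pass, can be sketched with a toy in-memory catalog. Everything here is an illustrative stand-in; real systems use zero-copy metadata branches rather than deep copies:

```python
import copy

# Toy catalog: branch name -> {table name -> rows}.
catalog = {"main": {"orders": [{"id": 1, "amount": 100}]}}

def branch(src: str, dst: str) -> None:
    """Create an isolated working copy of a branch's tables."""
    catalog[dst] = copy.deepcopy(catalog[src])

def merge(src: str, dst: str) -> None:
    """Publish a branch's state back, e.g. after data-quality tests pass."""
    catalog[dst] = copy.deepcopy(catalog[src])

branch("main", "dev")
catalog["dev"]["orders"].append({"id": 2, "amount": 250})  # experiment safely
assert len(catalog["main"]["orders"]) == 1                 # main is untouched
merge("dev", "main")
print(len(catalog["main"]["orders"]))  # 2: the change is now published
```

The value of the pattern is that production ("main") never sees half-finished pipeline output, which is the reproducibility and robustness theme the conversation keeps returning to.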
Guest: Viraj Narayanan, CEO of Cornerstone AI
Despite $180 billion spent on big data tools and technologies, poor data quality remains a significant barrier for businesses, especially in achieving Generative AI goals. Published at: https://www.eckerson.com/articles/poor-data-quality-is-a-full-blown-crisis-a-2024-customer-insight-report
A professor, an analytics director, and a podcaster walk into a bar and order a whiskey. Which brand do they order? And how does that data make its way into a marketing mix model?

That's what Simon and Jim wanted to know, so they asked Elea Feit, Associate Dean of Research and Professor of Marketing at Drexel, and Karen Chisholm, Director of Transformation Analytics at Pernod Ricard.

Find out the biggest challenge marketers are facing today regarding measurement, and how they're tackling it. Find out what's in store for marketing measurement and MMM in the next 3 years.

Grab a drink and have a listen :)

▶️ Watch on YouTube

Links from the show:
Marketing Science Institute
The Advertising Research Foundation
Elea Feit on LinkedIn
eleafeit.com
Karen Chisholm (email about job opportunities!)

00:47 Today's Topic: Marketing Mix Modeling
02:10 Introducing the Guests
05:07 MSI and ARF Initiative
07:52 Survey Insights and Challenges
12:20 Measurement Techniques and Strategies
16:34 Brand-Level Optimization and Earned Media
22:44 Granularity in Marketing Mix Modeling
28:54 Understanding Marketing Mix Modeling
29:18 The Four Ps and Their Importance
30:20 Media Mix Modeling vs. Marketing Mix Modeling
32:29 Challenges in Media and Marketing Mix Modeling
34:09 Always-On Discounts and Their Impact
37:14 Data Quality and Availability Issues
40:38 The Future of Marketing Mix Modeling
43:08 Industry Perspectives and Best Practices
50:28 Open Source Solutions and In-House Modeling
54:58 Job Opportunities and Final Thoughts
What if the future of innovation isn't about choosing between human intelligence and artificial intelligence, but about them working together seamlessly?

In this episode, Sarah Schlobohm, Chief AI Officer at Zally®, shares her unique view on how AI is transforming the world. She takes us through her journey from studying particle physics to leading AI strategy, showing how technical skills can drive business progress. Sarah breaks down the complexities of AI, discussing the fine balance between technological growth and ethical responsibility.

Drawing on her wide-ranging experience, she explores how AI is reshaping industries from healthcare to archaeology while stressing the importance of human-AI collaboration. She highlights the need for responsible AI development and the potential of new technologies to tackle big challenges. A fascinating look at the future of AI, combining scientific expertise with strategic thinking and a passion for innovation.

0:00 Sarah's Journey and Role at Zally®
4:49 The Role of AI in Business and Industry Disruption
8:49 Leadership and Communication in AI
13:33 Challenges and Opportunities in AI Implementation
20:10 The Future of AI and Ethical Considerations
20:30 The Role of Data Quality in AI Success
31:54 The Impact of AI on Business and Society
32:18 The Role of Human and Machine Collaboration
36:19 The Importance of Networking and Professional Development
38:39 The Role of Data in Business Strategy

For more information head to https://www.miraitalent.com/

Welcome to "Let's Talk Data", a deep dive into the transformative realm of data with the trailblazers of this disruptive industry. Your host Emma Crabtree explores the latest trends and developments in the data sector, while also delving into the personal stories of these data pioneers. They reveal how they harness data, not just in business, but in shaping their everyday lives, from optimising daily routines to making data-informed decisions.
You'll hear about their motivations, role models, and how they plan to use their position to innovate, inspire and influence change.Many companies today are still missing the golden opportunity to unlock true potential with their data. Our conversations will shed light on this untapped potential as well as the pivotal role that data professionals play in driving progress.“Let's Talk Data” is your go-to source for inspiration and knowledge, providing a front-row seat to the future of data-driven insights and innovations.
Last week, in my hotel room just after MicroConf, I got excited about repositioning. I have had some time to think about the steps forward since then, and here's what I've come up with. This week, I dive into what I have already done, what needs to be done next, and where this is going.

The blog post: https://tbf.fm/episodes/383-repositioning-podscan-from-monitoring-to-data-platform
The podcast episode: https://thebootstrappedfounder.com/repositioning-podscan-from-monitoring-to-data-platform/

Check out Podscan, the podcast database that transcribes every podcast episode out there minutes after it gets released: https://podscan.fm
Send me a voicemail on Podline: https://podline.fm/arvid

You'll find my weekly article on my blog: https://thebootstrappedfounder.com
Podcast: https://thebootstrappedfounder.com/podcast
Newsletter: https://thebootstrappedfounder.com/newsletter
My book Zero to Sold: https://zerotosold.com/
My book The Embedded Entrepreneur: https://embeddedentrepreneur.com/
My course Find Your Following: https://findyourfollowing.com

Here are a few tools I use. Using my affiliate links will support my work at no additional cost to you.
- Notion (which I use to organize, write, coordinate, and archive my podcast + newsletter): https://affiliate.notion.so/465mv1536drx
- Riverside.fm (that's what I recorded this episode with): https://riverside.fm/?via=arvid
- TweetHunter (for speedy scheduling and writing Tweets): http://tweethunter.io/?via=arvid
- HypeFury (for massive Twitter analytics and scheduling): https://hypefury.com/?via=arvid60
- AudioPen (for taking voice notes and getting amazing summaries): https://audiopen.ai/?aff=PXErZ
- Descript (for word-based video editing, subtitles, and clips): https://www.descript.com/?lmref=3cf39Q
- ConvertKit (for email lists, newsletters, even finding sponsors): https://convertkit.com?lmref=bN9CZw
This article by Piotr Czarnas, founder of DQOps, outlines a proven, team-based approach to tackling persistent issues like invalid data, delayed reporting, and inconsistent formats. Published at: https://www.eckerson.com/articles/overcoming-the-challenge-of-low-data-quality
In this episode, Amir sits down with Santhosh Kumar, Head of Data at Trepp, to unpack the evolving world of Data as a Product. Data is no longer just a support function—it's becoming a core business driver. Santhosh shares how data teams are embracing a product-oriented approach, aligning closely with business goals while mirroring software engineering practices.

If you've ever wondered:
- How data can be treated like a shippable product
- What mindset shifts data teams need
- How collaboration between data, product, and tech teams is evolving

This episode is for you!
2025 is finally upon us, and the trends for healthcare marketers continue to evolve. EHR is quickly growing into an industry favorite for engaging HCPs in a contextually relevant location, brands and agencies are embracing Next Best Engagement (NBE) strategies to fine-tune their media, and data quality remains an ever-important part of a marketer's toolkit. Join Louis Naimoli, VP of Programmatic at Haymarket, in this candid conversation with MM+M.

Check us out at: mmm-online.com

Follow us:
YouTube: @MMM-online
TikTok: @MMMnews
Instagram: @MMMnewsonline
Twitter/X: @MMMnews
LinkedIn: MM+M

To read more of the most timely, balanced and original reporting in medical marketing, subscribe here.
In this episode of Generation AI, hosts JC and Ardis tackle one of the most pressing concerns in higher education today: how to trust AI outputs. They explore the psychology of trust in technology, the evaluation frameworks used to measure AI accuracy, and how Retrieval Augmented Generation (RAG) helps ground AI responses in factual data. The conversation offers practical insights for higher education professionals who want to implement AI solutions but worry about accuracy and reliability. Listeners will learn how to evaluate AI systems, what questions to ask vendors, and why having public-facing content is crucial for effective AI implementation.

Introduction: The Trust Challenge in AI (00:00:06)
- JC Bonilla and Ardis Kadiu introduce the topic of trusting AI outputs
- Contrasting traditional predictive modeling metrics with new AI evaluation methods
- Understanding that trust is both earned and lost through interactions

The Psychology of Trust in AI (00:03:35)
- How human psychology frameworks for trust transfer to technology
- Challenge appraisal (seeing AI as enhancement) versus threat appraisal (seeing AI as risky)
- Example: How autonomous driving shows trust being built or lost through micro-decisions
- The importance of making AI systems more predictable to humans

Evaluating AI Outputs: The Evals Framework (00:11:41)
- Moving from traditional machine learning metrics to new evaluation methods
- How OpenAI Evals works as a standard for measuring AI performance
- Creating test sets with thousands of variations to check AI outputs
- The concept of "AI checking on AI" for more thorough evaluation
- Element451's achievement of 94-95% accuracy rates on their evaluations

Retrieval Augmented Generation (RAG) Explained (00:27:23)
- RAG as an "open book exam" approach for AI systems
- How data is processed, categorized, and made searchable
- The importance of re-ranking information to find the most relevant content
- How multiple documents can be combined to create accurate answers

Addressing Common AI Trust Concerns (00:33:31)
- Reducing hallucinations through proper grounding in source material
- Why "garbage in, garbage out" fears are often overblown
- Using public-facing content as reliable data sources
- The value of traceable sources in building confidence in AI responses

Conclusion: Building Earned Trust (00:38:11)
- Trust in AI comes from reliability and transparency
- The importance of asking the right questions when selecting AI partners
- How to distinguish between companies just talking about AI versus implementing best practices

- - - -

Connect With Our Co-Hosts:
Ardis Kadiu
https://www.linkedin.com/in/ardis/
https://twitter.com/ardis
Dr. JC Bonilla
https://www.linkedin.com/in/jcbonilla/
https://twitter.com/jbonillx

About The Enrollify Podcast Network: Generation AI is a part of the Enrollify Podcast Network. If you like this podcast, chances are you'll like other Enrollify shows too! Enrollify is made possible by Element451 — the next-generation AI student engagement platform helping institutions create meaningful and personalized interactions with students. Learn more at element451.com.

Attend the 2025 Engage Summit! The Engage Summit is the premier conference for forward-thinking leaders and practitioners dedicated to exploring the transformative power of AI in education. Explore the strategies and tools to step into the next generation of student engagement, supercharged by AI. You'll leave ready to deliver the most personalized digital engagement experience every step of the way. Register now to secure your spot in Charlotte, NC, on June 24-25, 2025! Early bird registration ends February 1st -- https://engage.element451.com/register
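The RAG pipeline the hosts outline (process and index content, retrieve candidates, re-rank them, then combine the most relevant documents into grounded context) can be sketched in miniature. This is an illustrative toy only: keyword-overlap scoring stands in for real embedding search, and the documents, function names, and threshold are invented, not Element451's system.

```python
# Toy sketch of retrieve -> re-rank -> combine, the "open book exam" idea:
# the model only answers from context assembled out of indexed source documents.

def tokenize(text):
    # Crude stand-in for the "processed, categorized, made searchable" step.
    return set(text.lower().split())

def retrieve(query, docs, k=3):
    # Score each document by keyword overlap with the query (0.0 to 1.0).
    q = tokenize(query)
    scored = [(len(q & tokenize(d)) / len(q), d) for d in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:k]

def rerank_and_combine(candidates, min_score=0.2):
    # Keep only candidates above a relevance threshold and merge them
    # into one grounded context block for the model to read.
    kept = [doc for score, doc in candidates if score >= min_score]
    return "\n".join(kept)

docs = [
    "Admissions deadlines for fall are posted on the university website.",
    "The dining hall menu rotates weekly.",
    "Fall admissions applications close on January 15.",
]
context = rerank_and_combine(retrieve("when do fall admissions close", docs))
```

In a real system the overlap score would be an embedding similarity and the re-ranker a separate model, but the shape of the flow, and why irrelevant pages never reach the model, is the same.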
Welcome back to Intellicast! Joining Brian Peterson on this jam-packed episode is Gabby Blados. They talk about conferences and data quality, and discuss some recent headlines from around the research world.

Kicking off the episode, Brian and Gabby talk about the upcoming conferences, including SampleCon and Quirks Chicago. With Gabby attending Quirks, she gives Brian a preview of some of the topics, sessions, and activities she most looks forward to while in Chicago in early April.

Next, Brian and Gabby turn their sights to the latest market research news, starting with the launch of the Global Data Quality Initiative's updated Data Quality Pledge. Brian and Gabby discuss how they feel this is a step in the right direction to improve overall data quality. They are hesitant, though, since it is a pledge, and there is no one to hold people accountable to its standard other than self-regulation. They both agree that the pledge is probably a step toward some sort of regulation around data quality.

In the second data quality story, they discuss the results and key takeaways from the new Data Quality Benchmarking Study released by the Insights Association. Brian and Gabby discuss some of the stats, including some that were somewhat surprising to both of them. You can get your free copy of the Insights Association Data Quality Benchmarking Report here.

Next, Brian and Gabby talk about some of the recent headlines from around the market research industry, including ComScore's 2024 results, Glimpse's rebrand to Panoplai, and Disney shutting down FiveThirtyEight. In our final story, they touch on the reports about Kantar potentially looking to sell their Worldpanel/Numerator division.

Thanks for listening! If you're headed to Pasadena for SampleCon, be sure to say hello to Kathleen Hock. If you will be in Chicago for Quirks, be sure to connect with Gabby or Abby Synder. Did you miss one of our webinars or want to get some of our whitepapers and reports?
You can find it all on our Resources page on our website here. Learn more about your ad choices. Visit megaphone.fm/adchoices
Continuing the last Quality Matters episode, host Andy Reynolds and NCQA Chief Technology Officer, Ed Yurcisin, break down the complexities of the digital transformation in health care quality and explore the importance of high-quality data exchange, particularly in the context of HEDIS reporting and the FHIR interoperability standard. Ed explains how NCQA's work in digital HEDIS measurement not only improves health care quality reporting, but also lays the groundwork for broader industry advancements. By ensuring consistent, standardized data for digital HEDIS, NCQA is setting the stage for better measurement of public health, smoother prior authorization and general data accessibility.The conversation also explores the technical side of digital quality measurement, focusing on Clinical Quality Language (CQL) and the role of HEDIS “engines” in the health care data ecosystem. Ed clarifies the difference between SQL and CQL, and underscores that NCQA's focus is on measures' content, not on building the end-to-end software systems that run measures.Through collaborations like the Digital Quality Implementers Community, NCQA is working to ensure alignment across CQL platforms so everyone is “doing the same math.” Amol Vyas, NCQA Vice President for Interoperability, joins the conversation to explain how a public-private partnership is bringing choice and confidence to the market for CQL engines.Ed reflects on how his international perspective and personal experiences shape his passion for health care data interoperability. He shares how challenges accessing medical records for his family members underscore the need for a seamless, patient-centered health care system. 
His real-world perspective highlights why creating standardized, high-quality data isn't just a technical challenge, but a crucial factor in helping to ensure better, safer care for all. As the episode wraps, listeners are encouraged to explore NCQA's resources and upcoming events to stay informed on the future of digital quality.

Key Quote:
“HEDIS measures are incorporated into government payment programs—for example, Medicare Star Ratings. There's incentive to enable digital HEDIS because it is tied to your CMS Star Ratings and the money a Medicare Advantage plan might receive from the government. That's not the case for other important use cases, whether it be public health or prior authorization. So our infrastructure is tied to financial returns incenting organizations to make higher quality data accessible for digital HEDIS. And that means if it's good enough for digital HEDIS, it's been cleansed and analyzed in a way that could be used for public health, could be used for prior authorization—all of these different use cases.”
Ed Yurcisin

Time Stamps:
(02:10) Clearing a Path for Data Quality
(05:30) HEDIS “Engines” vs. HEDIS “Calculators”
(07:17) Measures' Content vs. Software that Runs Measures
(11:18) Digital Quality Implementers Community
(19:35) The Need for Data Quality Cuts Close to Home

Links:
Bulk FHIR Quality Coalition
Digital Quality Implementers Community
NCQA Digital Hub
Connect with Ed Yurcisin
Connect with Amol Vyas
Send Everyday AI and Jordan a text message

You might not think about the supply chain every day. But every product you use or service you rely on is 100% impacted by the global supply chain. And AI is completely reshaping how it works. Join us to find out how.

Newsletter: Sign up for our free daily newsletter
More on this Episode: Episode Page
Join the discussion: Ask Jordan and Julian questions on AI and supply chains
Upcoming Episodes: Check out the upcoming Everyday AI Livestream lineup
Website: YourEverydayAI.com
Email The Show: info@youreverydayai.com
Connect with Jordan on LinkedIn

Topics Covered in This Episode:
1. Role of Generative AI in Supply Chains
2. Challenges in the Supply Chain Industry
3. Automation and Robotics in Logistics
4. Accessibility of AI Solutions
5. Data Quality and Management

Timestamps:
00:00 Global Supply Chain Analytics Platform
03:56 AI-Driven Procurement Insights
09:57 Supply Chain Transparency and Challenges
12:37 Meta & Apple Enter Robotics Race
17:32 Data Classification Challenges in Industry
21:22 Accessible AI: From Chatbots to Agents
23:39 Navigating AI Disruption in Product Suites
27:02 Data Management and Security Essentials

Keywords: Generative AI, global supply chain, artificial intelligence, machine learning, large language models, data extraction, ERP systems, data classification, supply chain analytics, predictive analytics, scenario planning, robotics, automation, ChatGPT, business impact, ESG compliance, supply chain insights, procurement officer, spend management, minority supplier spend, data quality, predictive insights, scenario analysis, enterprise resource planning, SAP, Oracle, Coders, generative AI applications, supply chain transformation, generative AI impact, technology disruption.

Ready for ROI on GenAI? Go to youreverydayai.com/partner
As organisations strive to stay competitive in the age of AI, data trust has become a critical factor for success. Without it, even the most advanced AI initiatives are bound to fall short. With the rapid advancement of technology, prioritising trust in data is essential for unlocking AI's full potential and driving meaningful results. Conversely, a lack of data trust can undermine decision-making, operational efficiency, and the success of AI initiatives.

In this episode, Christina Stathopoulos, Founder at Dare to Data, speaks to Jay Limburn, Chief Product Officer at Ataccama, to explore these pressing topics. Together, they share actionable insights, real-world examples, and innovative strategies to help businesses harness the power of trusted data.

Key Takeaways:
- Data trust is essential for confident decision-making.
- AI can significantly reduce mundane tasks.
- Organizations must focus on their data strategy.
- Customer experience is a key area for AI application.
- Data teams are crucial for successful AI initiatives.
- Proactive data management is becoming the norm.
- The chief data officer's influence is growing.
- Data quality and security are critical challenges.
- AI can enhance regulatory reporting processes.
- Trust in data is vital for successful AI projects.

Chapters:
00:00 - Introduction to Data Trust and AI Integration
09:36 - The Role of AI in Operational Efficiency
12:47 - Balancing Short-term and Long-term Data Priorities
15:00 - Enhancing Customer Experience through AI
19:08 - Aligning Workforce and Culture for AI Success
21:03 - Innovative Strategies for Data Quality and Security
24:35 - Final Thoughts on Data Trust and AI Success
Today, we're talking to Adam Dille from Quantum Metric and Zeba Hasan from Google. We discuss the importance of data quality for interfacing with AI, the most common mistakes we face when building products with AI, and how to get the rest of the company to buy into the advantages AI has to offer. All of this right here, right now, on the Modern CTO Podcast! To learn more about Quantum Metric, check out their website here: https://www.quantummetric.com/ Produced by ProSeries Media: https://proseriesmedia.com/ For booking inquiries, email booking@proseriesmedia.co
About Erin Weber:
Erin Richter Weber is a healthcare leader with 14 years at CAQH. She oversees CAQH CORE, advancing healthcare automation, and CAQH Insights, producing the annual Index report. Erin unites stakeholders to address industry challenges through data-driven innovation. Previously, she consulted for PwC and led research at the Advisory Board Company. She holds a Master's from Harvard and a Bachelor's from Cornell, making her a pivotal voice in healthcare standards and policy.

About Don Rucker:
Dr. Donald Rucker is the Chief Strategy Officer of 1upHealth and former National Coordinator for Health IT at HHS (2017–2021). He led the ONC's 21st Century Cures Act Interoperability Rule, enabling secure patient access to health data via standardized FHIR APIs. A board-certified physician with clinical informatics expertise, he co-developed the first Windows-based electronic medical record. Dr. Rucker holds degrees from Harvard, the University of Pennsylvania, and Stanford, blending medicine, technology, and leadership.

Things You'll Learn:
- Provider data is the backbone of the healthcare system, powering everything from patient care to billing, and requires standardization to ensure accuracy.
- The healthcare industry needs to learn from the internet and establish a system similar to domain name services to reduce friction.
- Data quality is paramount for interoperability, requiring standardized definitions of data elements like location.
- AI has a role in improving provider data: it can merge the multiple sources of that data and enhance the quality of the combined result.
- The healthcare industry is behind other industries, and boldness comes from adopting solutions that have already been implemented elsewhere.
Resources:Connect with and follow Erin Weber on LinkedIn.Follow CAQH on LinkedIn and visit their website.Connect with and follow Don Rucker on LinkedIn.Learn more about 1upHealth on their LinkedIn and website.Check out the latest annual CAQH Index Report here.
How strong is your: Deliverability, Lead Sourcing, Copywriting, & Data Quality? Join us as Tal Baker-Phillips from Lemlist unpacks the secrets to successful prospecting that can elevate your sales game. Tal sheds light on the critical components of prospecting, such as the ever-evolving art of copywriting. Each of these elements is vital, and ignoring even one can derail your efforts. Discover cutting-edge email prospecting strategies that prioritize quality and precision over sheer volume. We explore the power of strategic sending practices, such as inbox rotation, and the importance of A/B testing to maintain deliverability while expanding outreach. You'll learn the significance of focusing on a single problem per email and how value-based calls-to-action can ignite genuine conversations with your prospects. This episode promises to redefine how you connect with your audience and optimize your email communication for maximum impact.
00:00 Introduction
00:37 Our Predictions for GTMOps in 2025
26:33 GTM or GTFO: Is AI email personalization worth it?
37:01 Q&A: When is data quality good enough?

Hear more from us:
Subscribe to us on Youtube: https://www.youtube.com/channel/UCN-x5u0G03LWmU0Ds_4zR8w
Subscribe to our newsletter here: https://www.cs2marketing.com/revenue-growth-architects#subscribe-to-newsletter
Follow Crissy on LinkedIn: https://www.linkedin.com/in/crveteresaunders/
Follow Charlie on LinkedIn: https://www.linkedin.com/in/charliesaunders/
Follow Xander on LinkedIn: https://www.linkedin.com/in/xanderbroeffle/
How Data Quality Fuels Data Usability Join radio host Jim Tate on this special episode from a recent virtual event with Clinical Architecture CEO Charlie Harp along with interoperability expert Didi Davis from The Sequoia Project as they discuss the challenges of interoperability and usability. If clinical data is the driving force for interoperability, then quality is the gas that fuels the data usability engine. Jim, Charlie, and Didi look at the key challenges around usability. To stream our Station live 24/7 visit www.HealthcareNOWRadio.com or ask your Smart Device to “….Play Healthcare NOW Radio”. Find all of our network podcasts on your favorite podcast platforms and be sure to subscribe and like us. Learn more at www.healthcarenowradio.com/listen
Welcome to another insightful episode of Predictable B2B Success! Today, we're diving deep into the ever-evolving world of data observability with Ryan Yackel, a seasoned product strategy leader at IBM. Ryan's expertise helps transform complex data quality issues into streamlined, proactive solutions that drive business success.

Join us as Ryan unpacks the critical role of data observability in today's digital age, linking it to broader data governance strategies that resonate at the executive level. He'll share his experiences from open-source conferences in Tel Aviv and New York and discuss the importance of a strong narrative design to differentiate your business in the crowded B2B tech space.

Curious about the difference between basic alerting and comprehensive observability? Or how a well-crafted strategic narrative can shift your market positioning? Ryan's insights offer compelling industry knowledge and practical tactics for enhancing data reliability and governance. We'll also delve into how pilot testing and proof-of-concept initiatives can demonstrate real-world value, and the nuances of integrating data observability within IBM's robust tech ecosystem. Whether you're a data engineer, a marketing strategist, or a tech executive, this episode promises to open your eyes to new possibilities in data management. Tune in and discover how to elevate your data strategy to new heights!

Some areas we explore in this episode include:
- Data Observability Campaigns: Awareness efforts and collaborations in the emerging data observability space.
- Community Engagement: Participation in open-source conferences and tech meetups to discuss technical deployments.
- Executive-Level Strategy: Aligning data observability with data governance to enhance prioritization.
- DIY Approach vs. Observability: Comparison between basic alerting/monitoring and comprehensive observability with ML detection.
- Strategic Narrative and Storytelling: The importance of a strong narrative for effective product communication.
- Pilot Testing for Proof of Concept: Using pilots to demonstrate the effectiveness of data observability solutions.
- Data Fabric and Data Mesh: IBM's hybrid architecture and integrating data observability.
- Data Quality and Observability: The importance of "data quality in motion" and evolving observability tools.
- Data Acquisition Strategy: Combining top-down and bottom-up approaches for integrating Databand.
- IBM Acquisition: The impact of Databand's acquisition by IBM and cultural integration with AI and quantum computing initiatives.
And much, much more...
In this episode of the Eye on AI podcast, we dive into the critical issue of data quality for AI systems with Sedarius Perrotta, co-founder of Shelf. Sedarius takes us on a journey through his experience in knowledge management and how Shelf was built to solve one of AI's most pressing challenges—unstructured data chaos. He shares how Shelf's innovative solutions enhance retrieval-augmented generation (RAG) and ensure tools like Microsoft Copilot can perform at their best by tackling inaccuracies, duplications, and outdated information in real-time.

Throughout the episode, we explore how unstructured data acts as the "fuel" for AI systems and why its quality determines success. Sedarius explains Shelf's approach to data observability, transparency, and proactive monitoring to help organizations fix "garbage in, garbage out" issues, ensuring scalable and trusted AI initiatives. We also discuss the accelerating adoption of generative AI, the future of data management, and why building a strategy for clean and trusted data is vital for 2025 and beyond. Learn how Shelf enables businesses to unlock the full potential of their unstructured data for AI-driven productivity and innovation.

Don't forget to like, subscribe, and hit the notification bell to stay updated on the latest advancements in AI, data management, and next-gen automation!

Stay Updated:
Craig Smith Twitter: https://twitter.com/craigss
Eye on A.I. Twitter: https://twitter.com/EyeOn_AI

(00:00) Introduction and Shelf's Mission
(03:01) Understanding SharePoint and Data Challenges
(05:29) Tackling Data Entropy in AI Systems
(08:13) Using AI to Solve Data Quality Issues
(12:30) Fixing AI Hallucinations with Trusted Data
(21:01) Gen AI Adoption Insights and Trends
(28:44) Benefits of Curated Data for AI Training
(37:38) Future of Unstructured Data Management
In this episode, Bart De Muynck, Strategic Advisor and Supply Chain Expert, joins host Brian Glick, CEO of Chain.io, to discuss:
- Evolving technology across supply chains, from the early days to now
- Technology implementations and the importance of organizational change when getting started
- Staying competitive with technology
- The future of AI in supply chains
- What supply chain pros should expect for 2025 and beyond

Bart De Muynck is an industry thought leader with over 30 years of supply chain and logistics experience. Bart has worked for major international companies such as EY, GE Capital, Penske Logistics, and PepsiCo, as well as several tech companies. Bart spent 8 years as a VP of Research at Gartner and most recently served as the Chief Industry Officer at project44.
If you were AI, how would you reach your full potential? Check out this episode of Six Five On the Road at AWS re:Invent, where host Keith Townsend is joined by Qlik's Brendan Grady, General Manager, Analytics Business Unit, and Sam Pierson, SVP of Data Business Unit R&D Organization. Learn about the pivotal role of data quality and open architectures in realizing the full potential of AI technologies within organizations.

Deep dive into this ⤵️
- The essentiality of a robust data foundation as a precursor to AI adoption, and the potential pitfalls in the absence of a solid data groundwork.
- Identifying the hallmarks of a dataset that is genuinely prepared for AI utilization.
- The significance of open architectures for AI implementations, including the risks of siloed systems in the evolving landscape of AI technology.
- Insight into how Qlik is assisting its clients to establish strong data foundations and spearhead successful AI projects, complemented by illustrative examples.
- Reflections on instances where AI initiatives have faltered and the strategies that could have steered these projects back on course.
In this episode, host Amir sits down with Amit Sahani, Global Head of Data at StoneX, to delve into data leadership's practical and strategic aspects. Amit shares invaluable insights on building robust data solutions, tackling data quality challenges, and fostering alignment between data engineering and business objectives. Listeners will gain practical advice on the importance of data governance, leveraging technological advancements like cloud computing, and incorporating feedback loops to drive continuous improvement. This episode is a must-listen for anyone seeking to bridge the gap between data and business.

Highlights:
00:56 – Grounded Leadership in Data: Amit shares his philosophy on leadership and its role in successfully implementing data-driven solutions.
02:30 – Challenges in Delivering Business Solutions: Insight into the common hurdles organizations face when aligning data initiatives with business needs.
04:57 – Garbage In, Garbage Out: A deep dive into the critical issue of data quality and its impact on achieving high-fidelity business delivery.
10:00 – Adoption and Feedback Loops: Why continuous feedback loops are essential for improving data solutions and driving adoption across teams.
13:35 – Investment in Data Quality and Governance: Exploring the importance of prioritizing data governance to create a strong foundation for successful data strategies.
22:15 – The Role of Technology in Data Solutions: How advancements like cloud computing are shaping the future of data engineering and management.

Key Takeaways:
- Data Quality is Everything: Poor data leads to poor outcomes. Investing in data quality is non-negotiable for success.
- Feedback Loops Drive Improvement: Continuous feedback ensures that data solutions remain relevant and effective.
- Governance is Critical: Establishing strong data governance practices enables scalability and compliance.
- Technology is an Enabler: Leveraging tools like cloud computing enhances efficiency and flexibility in data solutions.
- Alignment is Key: Successful data solutions bridge the gap between engineering capabilities and business objectives.

Guest: Amit Sahani is the Global Head of Data at StoneX, with nearly two decades of experience in business transformation, cloud adoption, and international expansion across the US, South America, EMEA, and India. At StoneX, he has built a global data team, launched an in-house data aggregator, and established a market data services group, positioning data as central to the company's digital transformation. With expertise in regulated industries and global operations, Amit drives transformative change through empathy, strategic insight, and a sharp focus on business outcomes, effectively engaging stakeholders at all levels.

LinkedIn: https://www.linkedin.com/in/asahani/

----

Liked this episode? Don't forget to subscribe, leave a review, and share with your network! For more insights, follow The Tech Trek.
Highlights from this week's conversation include:
- Joyce's Background and Journey in Data (0:39)
- Technological Growth in Logistics (3:51)
- Leadership and Communication in Logistics (6:54)
- Impact of Data Quality (9:13)
- Significance of Data Entry Accuracy (12:05)
- Data's Role in Decision Making (16:01)
- The Cost of Adding Data Points (21:26)
- Real-Time Data in Logistics (24:28)
- Understanding Master Data (31:15)
- Data vs. Information Distinction (33:21)
- Navigating Change in Data Management (37:35)
- Career Advice for Data Practitioners and Parting Thoughts (41:10)

The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we'll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data. RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack, visit rudderstack.com.
This episode of Tartlecast emphasizes the critical importance of remembering your TARTLE pin. We explain why your pin is essential for protecting your data and accessing your earnings, and we offer tips for creating a secure and memorable pin. We also address the common issue of forgotten pins and why TARTLE cannot retrieve or reset them for users.

Key Takeaways:
- Your TARTLE pin is the key to accessing your account and protecting your data.
- TARTLE cannot access or reset your pin due to security and privacy protocols.
- Choose a pin that is both secure and memorable to avoid losing access to your account.
- If you forget your pin, there is currently no way to recover it. You will need to create a new account.

TCAST is a tech and data podcast co-hosted by Alexander McCaig and Jason Rigby. The podcast delves into the latest trends in Big Data, Artificial Intelligence, and Humanity, examining the evolving landscape of digital transformation and innovation. The show features interviews with data scientists, thought leaders, and industry experts at the forefront of technological advancement for human progress. Listeners can explore a wide range of TCAST episodes on their preferred podcast platforms.

Connect with TCAST:
Website: https://tartle.co/
Facebook: https://go.tartle.co/fb
Instagram: https://go.tartle.co/ig
Twitter: https://go.tartle.co/tweet

What's your data worth? Find out at https://tartle.co/
We've noticed some confusion about how earnings are displayed on TARTLE. Let's clear things up! Our latest Tartlecast episode explains the correct way to read your wallet balance and why some data elements have a value of less than a penny.
This Tartlecast episode clarifies the difference between the small rewards given for completing actions on TARTLE (like creating a wallet or publishing a data packet) and the larger earnings that come from bids placed on your data packets. We explain why TARTLE offers these small rewards and how they contribute to the overall goal of creating a universal basic income.
In this episode of Tartlecast, we address a common question among TARTLE users: "What happens to my data packet after I publish it?" We clarify the process, explaining that published data packets are securely stored in the user's data vault and are not sold, moved, or transferred until a bid is placed on them. We also discuss the importance of completing data packets and building a robust data vault to attract potential buyers.

Key Takeaways:
• Published data packets are NOT immediately sold. They are stored in your secure data vault.
• TARTLE does not have access to your data vault. Only YOU have the keys.
• Once a bid is placed on your data packet, you will be notified and can choose to accept or decline the offer.
• Completing more data packets increases your visibility to buyers and your earning potential.
We've heard your questions, and we're here to help! Our latest Tartlecast episode tackles the common question of why some data packets don't receive bids. Learn how to make your data more appealing to buyers and increase your earning potential on TARTLE.
TARTLE's executives discuss how their Real Intelligence API serves different market segments in AI development, from startups to government agencies. The episode reveals how the API provides 50% of the total AI/ML lifecycle needs through a single integration point, offering ethical data sourcing and real-time human feedback across diverse global demographics.

Key Segments Covered:

Enterprise AI Departments:
• S3 bucket integration capabilities
• Streamlined legal compliance
• Scalable data acquisition solutions
• 20-30 minute integration timeline

AI Startups:
• Initial 1,000-5,000 data sample requirements
• Cost-effective scaling options
• Market differentiation through unique datasets
• Real-time feedback for rapid development

Government Initiatives:
• Fourth, Fifth, and Sixth Amendment compliance
• Auditable data trails
• Representative population sampling
• Direct public engagement capabilities

Integration Benefits:
• Flexible volume scaling
• Documentation support
• Quick implementation
• Custom data collection options
In this episode of Tartlecast, we address a common question from our global community: "Why can't I withdraw my earnings to PayPal in my country?" We discuss the challenges of international payments and the steps TARTLE is taking to provide more withdrawal options for our members worldwide. We also emphasize the importance of continuing to build your data vault on TARTLE, even while we work on expanding payment solutions.

Key Takeaways:
• TARTLE is aware that PayPal is not available in all countries.
• We are actively working to add more withdrawal options, including bank transfers, digital payment methods, and gift cards.
• Your earnings are safe and secure on TARTLE, even if you cannot withdraw them immediately.
• We encourage you to continue completing data packets and building your data vault, as this helps us attract more buyers for your data.