In today's rapidly changing technological environment, data offers new ways for businesses to improve or even completely transform their business models. But with this rapid change, it's difficult to separate hype from reality and determine which practical steps can deliver value in real-world implementations.
In this episode of the Don't Panic, It's Just Data podcast, host and EM360Tech podcast producer Shubhangi Dua speaks to Donnie Owsley from Snohomish County, and Jeff Burton and Tom Lavery from the University of British Columbia. All of the speakers will be presenting at the upcoming Peak of Data and AI event, organised by Safe Software, the creators of FME. Scheduled to take place in Seattle from May 5th to 8th, 2025, The Peak is an exciting gathering for data and AI innovators. This conversation offers a preview of some of the practical applications and insights that will be shared at the event.

The podcast also covers the development of creative solutions for enhancing accessibility in urban environments. The UBC speakers discuss their creation of an accessible university campus navigation system, a project that showcases the power of integrating FME with platforms like ArcGIS. This discussion spotlights the challenges and ingenuity involved in building inclusive wayfinding solutions that cater to the diverse needs of a community.

The conversation sheds light on some tangible ways in which FME is being used across different sectors to tackle specific challenges and drive innovation. It provides valuable context for the types of practical knowledge and problem-solving approaches that will be central to The Peak of Data and AI event.

For further information on what we've talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com.

Key Highlights
- Discover how to use tools like FME for preemptive IT issue resolution.
- Learn an approach to creating inclusive navigation systems with FME and ArcGIS.
- Get practical insights into current industry applications.
- Preview actionable data and AI solutions.
- Explore the versatile application of FME in your organisation.

About Safe Software
Founded in 1993, Safe is headquartered in Surrey, BC with over 200 team members and counting.
We're always looking for talented individuals with diverse backgrounds who are determined to learn and grow. Over 20,000 organisations around the world use FME in industries like AEC, government, utilities, and transportation to maximise the value of their data.
In this episode of "Don't Panic, It's Just Data," host Christina Stathopoulos explores the world of real-time analytics and its impact on financial decision-making. She is joined by Thomas Gore, insightsoftware's Director of Product Management for extended planning and analysis (XP&A), and Cody Riemenschneider, Director of Solutions Engineering, to discuss the challenges and opportunities of integrating real-time data into finance.
"So you want trusted data, but you want it now? Building this trust really starts with transparency and collaboration. It's not just technology. It's about creating a single governed view of data that is consistent no matter who accesses it," says Errol Rodericks, Director of Product Marketing at Denodo.

In this episode of the 'Don't Panic, It's Just Data' podcast, Shawn Rogers, CEO at BARC US, speaks with Errol Rodericks from Denodo. They explore the crucial link between trusted data and successful AI initiatives, discussing key factors such as data orchestration, governance, and cost management within complex cloud environments.

We've all heard the horror stories: AI projects that fail spectacularly, delivering biased or inaccurate results. But what's the root cause of these failures? More often than not, it's a lack of focus on the data itself. Rodericks emphasises that "AI is only as good as the data it's trained on." This episode explores how organisations can avoid the "garbage in, garbage out" scenario by prioritising data quality, lineage, and responsible AI practices. Learn how to avoid AI failures and discover strategies for building an AI-ready data foundation that ensures trusted, reliable outcomes.
Key topics include overcoming data bias, ETL processes, and improving data sharing practices.

Takeaways
- Bad data leads to bad AI outputs.
- Trust in data is essential for effective AI.
- Organisations must prioritise data quality and orchestration.
- Transparency and collaboration are key to building trust in data.
- Compliance is a responsibility for the entire organisation, not just IT.
- Agility in accessing data is crucial for AI success.

Chapters
00:00 The Importance of Data Quality in AI
02:57 Building Trust in Data Ecosystems
06:11 Navigating Complex Data Landscapes
09:11 Top-Down Pressure for AI Strategy
11:49 Responsible AI and Data Governance
15:08 Challenges in Personalisation and Compliance
17:47 The Role of Speed in Data Utilisation
20:47 Advice for CFOs on AI Investments

About Denodo
Denodo is a leader in data management. The award-winning Denodo Platform is the leading logical data management platform for transforming data into trustworthy insights and outcomes for all data-related initiatives across the enterprise, including AI and self-service. Denodo's customers in all industries all over the world have delivered trusted AI-ready and business-ready data in a third of the time and with 10x better performance than with lakehouses and other mainstream data platforms alone.
In this episode of our 'Don't Panic, It's Just Data' podcast series, Christina Stathopoulos, Founder at Dare to Data, speaks with Matthew Cawsey and Arnjah Dillard from Stibo Systems. Matthew and Arnjah explain the importance of CMDM and CXDC for maintaining positive customer engagement. Learn how to move beyond basic personalisation, use data ethically, and leverage AI for a better customer experience. Let's transform your messy data into a competitive advantage.
In the latest episode of the Don't Panic It's Just Data podcast, we connected with speakers who provided a preview of their presentations at the upcoming Peak of Data and AI event in Seattle, organised by Safe Software from May 5-8, 2025. This premier gathering, hosted by Safe Software, the creators of FME, will be a hub for data and AI innovators, and this podcast episode offers an exclusive look into what attendees can expect.

Our conversation featured Margaret Smith and Reshma Joy from the West Virginia Department of Transportation. They shared their crucial work in ensuring data integrity through rigorous validations of their Linear Reference System data. This foundational work underpins much of their operational efficiency and decision-making. They further revealed how they've achieved seamless integration between Survey123 and their R&H data, showcasing a strong example of how disparate systems can be harmonised for greater insight. This presentation will provide attendees with actionable strategies for enhancing data quality and interoperability.

We also spoke to Bruno Blanco, a GIS Engineer from Shelby County 9-1-1. Bruno walked us through how FME supports critical aspects of their 911 addressing workflow, particularly data aggregation, QA/QC, and attribution, within a larger automation framework. This work highlighted the power of automation in critical public safety infrastructure. By streamlining their addressing processes, Shelby County 9-1-1 is improving response times and ensuring more accurate location data, ultimately saving lives. Bruno's presentation will offer valuable insights into how organisations can leverage FME to automate complex workflows and enhance operational efficiency.

This episode serves as a compelling preview for the main event at The Peak of Data and AI. If you'd like to learn more about Bruno and Shelby County 9-1-1's story, check out their success story with Safe.
For further information on what we've talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com.

Takeaways
- Data validation is essential for accurate operations.
- FME enables seamless integration of disparate systems.
- Automation of critical processes improves public safety.
- Networking and community learning are key benefits of The Peak.
- Breakout sessions provide valuable hands-on FME knowledge.
- AI is increasingly influencing data integration workflows.
"There's been a lot of wrangling of data, a lot of wrangling of humans as well, which is a big theme for today," says Warwick Leitch, Product Management Director at insightsoftware.

In this episode of the 'Don't Panic, It's Just Data' podcast, Debbie Reynolds, CEO and Chief Data Privacy Officer at Debbie Reynolds Consulting LLC, speaks with Leitch from insightsoftware. They discuss the vital role of financial strategy and collaborative planning, particularly as it pertains to the decisions made by IT executives.

The question they address is: in a world awash with data, how do we transform it into actionable insights? Warwick shares his wealth of experience, offering practical advice and illuminating the path to successful budgeting and forecasting.

One such challenge addressed in the podcast is how organisations are securing executive buy-in. "And 51 percent of people find it difficult to engage senior executives to buy into the process, which is a roadblock. And 57 percent of organisations struggle cross-functionally," Warwick reveals. It's not just about the numbers. Warwick also emphasises the human element, reminding us that "Without people, we don't have anything."
In an era where AI looms large, it's crucial to remember that technology serves to enhance, not replace, human collaboration. Tune in to the podcast and learn how to navigate the complexities of financial strategy and collaborative planning.

Takeaways
- Executive buy-in is crucial for successful budgeting.
- A clear vision helps guide the budgeting process.
- Thoughtful execution is key to effective planning.
- Fostering a culture of collaboration enhances participation.
- Data accuracy is vital in today's fast-paced environment.
- Avoid overcomplicating the budgeting process.
- Gamification can improve engagement in budgeting.
- AI can significantly streamline forecasting and reporting.
- Regularly updating forecasts leads to better accuracy.
- Understanding business measures is essential for effective planning.

Chapters
00:00 Introduction to Collaborative Financial Planning
05:01 The Importance of Executive Buy-In
10:09 Thoughtful Execution in Budgeting
15:01 Fostering a Culture of Collaboration
19:48 Defining Business Measures and Data Accuracy
24:52 Common Pitfalls in Collaborative Budgeting
29:52 The Future of Collaborative Planning with AI
The digital age is fuelled by data, and the engines powering that data are data centres. However, this growth comes at a significant energy cost. In the latest episode of the EM360Tech Don't Panic It's Just Data podcast, Shubhangi Dua speaks with Rolf Bienert, Technical & Managing Director of the OpenADR Alliance, to shed light on the urgent need for sustainable energy practices within the data centre industry. In this episode, we discuss the stark reality of escalating energy consumption, driven by factors like the rise of AI, and the critical importance of moving beyond superficial "green" initiatives to implement genuine, impactful solutions.

Covering the historical context of data centre energy usage, the evolution of energy demands, and the challenges of achieving net-zero goals, Rolf provides valuable insights into innovative solutions such as smart grids, microgrids, and virtual power plants. These hold immense potential for managing energy distribution efficiently and sustainably. Beyond technological solutions, the podcast addresses the critical role of regulatory frameworks and industry standards in fostering sustainable practices. These frameworks must adapt to modern energy consumption patterns, ensuring interoperability and reducing costs. The episode spotlights the importance of collaboration between IT and utility sectors, as well as open communication with the public, to address concerns about energy consumption and build trust.
Takeaways
- Data centres are increasingly becoming significant consumers of energy.
- Sustainability in data centres is often perceived as branding rather than genuine effort.
- AI's demand for processing power is escalating energy needs.
- Smart grids are essential for managing energy distribution effectively.
- Microgrids and virtual power plants offer promising solutions for energy sustainability.
- Enterprises can leverage renewable energy to become energy providers.
- Regulatory frameworks need to adapt to modern energy consumption patterns.
- Standards are crucial for ensuring interoperability and reducing costs.
- Collaboration between IT and utility sectors is vital for sustainable energy management.
- Open communication is key to addressing public concerns about energy consumption.

Chapters
00:00 Introduction to Data Centre Sustainability
03:22 Historical Perspective on Data Centre Energy Consumption
08:32 The Role of Smart Grids in Energy Management
12:40 Understanding Microgrids and Virtual Power Plants
21:30 Enterprise Strategies for Sustainable Data Centres
29:51 Regulatory Challenges and Opportunities
32:34 The Importance of Standards in Data Centre Growth
As organisations strive to stay competitive in the age of AI, data trust has become a critical factor for success. Without it, even the most advanced AI initiatives are bound to fall short. With the rapid advancement of technology, prioritising trust in data is essential for unlocking AI's full potential and driving meaningful results. Conversely, a lack of data trust can undermine decision-making, operational efficiency, and the success of AI initiatives.

In this episode, Christina Stathopoulos, Founder at Dare to Data, speaks to Jay Limburn, Chief Product Officer at Ataccama, to explore these pressing topics. Together, they share actionable insights, real-world examples, and innovative strategies to help businesses harness the power of trusted data.

Key Takeaways
- Data trust is essential for confident decision-making.
- AI can significantly reduce mundane tasks.
- Organisations must focus on their data strategy.
- Customer experience is a key area for AI application.
- Data teams are crucial for successful AI initiatives.
- Proactive data management is becoming the norm.
- The chief data officer's influence is growing.
- Data quality and security are critical challenges.
- AI can enhance regulatory reporting processes.
- Trust in data is vital for successful AI projects.

Chapters
00:00 - Introduction to Data Trust and AI Integration
09:36 - The Role of AI in Operational Efficiency
12:47 - Balancing Short-term and Long-term Data Priorities
15:00 - Enhancing Customer Experience through AI
19:08 - Aligning Workforce and Culture for AI Success
21:03 - Innovative Strategies for Data Quality and Security
24:35 - Final Thoughts on Data Trust and AI Success
The San Antonio River Authority (SARA) has experienced a transformative shift in data management, thanks to the powerful capabilities of FME. By integrating FME, SARA has streamlined data integration, improved efficiency, and enhanced decision-making processes across multiple departments. FME's ability to automate data transformation, standardise formats, and manage large volumes of spatial data has allowed the authority to optimise workflows, reduce manual errors, and accelerate project timelines.

A key highlight of SARA's success with FME is its use in predictive flood modelling and the standardisation of data workflows. By leveraging FME, SARA can more accurately predict flood risks, improving public safety and response times. This innovation not only enhances internal operations but also helps SARA lead in sustainable water management. With its versatility in handling diverse data sources and streamlining communication between systems, FME is a powerful investment for organisations seeking to improve operational efficiency and long-term strategic decision-making.

In this episode, Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, speaks to Jordan Merson, Enterprise Applications Supervisor at San Antonio River Authority, about the game-changing impact of FME.

Key Takeaways
- Data management challenges often stem from a lack of standardisation.
- FME allows for the seamless integration of various data sources.
- Predictive modelling can enhance emergency response efforts.
- FME provides tools for real-time data monitoring and alerts.
- The user-friendly interface of FME accelerates onboarding for new team members.
- FME can handle both spatial and non-spatial data effectively.
- Collaboration and knowledge sharing are key to successful data management.

Chapters
00:00 - Introduction to Data Management and FME
02:30 - Jordan's Journey in IT and Data Management
05:43 - Challenges Before FME Implementation
08:36 - Transformative Impact of FME on Data Processes
10:02 - Real-World Applications of FME at San Antonio River Authority
14:06 - Predictive Flood Modeling and Emergency Operations
16:31 - Standardization and Efficiency with FME
17:59 - Final Thoughts and Recommendations on FME
Summary
This discussion explores the complexities and strategies surrounding edge computing and data management, highlighting the importance of security, the challenges of vendor lock-in, the implications of data repatriation, and the necessity of ensuring high-quality data for AI systems. It emphasises the need for organisations to balance edge processing with centralised storage while future-proofing their data strategies against rapid technological changes. Building on their discussion, Jimmy Tam highlights the transformative role of edge computing in modern data management, emphasising the importance of governance, compliance, and interoperability to address the challenges of data sprawl and vendor lock-in.

Takeaways
- Edge computing is transforming how organisations manage data.
- Security at the edge is paramount to prevent intrusions.
- Data sprawl poses significant challenges for edge data management.
- Governance and compliance are essential for effective data management.
- Vendor lock-in can limit flexibility and adaptability in technology.
- Data interoperability is crucial for avoiding vendor lock-in.
- Data repatriation is a growing trend among organisations.
- AI systems require access to comprehensive data for training.
- Speed of data relevance is critical for effective AI applications.
- Flexibility in data strategies is essential for future-proofing organisations.

Sound Bites
- "Data sprawl is a significant problem."
- "Governance and compliance are crucial."
- "Data repatriation is absolutely real."
- "Speed of data relevance is critical."

Chapters
00:00 Introduction to Edge Computing and Data Management
02:53 Security Strategies for Edge Data
06:06 Vendor Lock-In and Data Interoperability
09:00 Data Repatriation and Cost Optimisation
11:57 Ensuring Quality Data for AI Systems
14:46 Balancing Edge Processing and Centralised Storage
17:59 Future-Proofing Data Strategies
In today's data-driven world, real-time analytics has become a cornerstone for businesses seeking to make smarter, faster decisions. From enhancing user experiences to enabling continuous intelligence, the ability to process data in real time is transforming industries. Yet challenges such as legacy systems and the demand for innovative data management approaches persist.

This episode explores the evolution of real-time analytics and its crucial role in modern data processing. We delve into how technology is reshaping the way businesses interact with data and the importance of user-centric design in creating powerful data applications.

Joining Christina Stathopoulos, Founder of Dare to Data, is Rahul Rastogi, Chief Innovation Officer at SingleStore. Together, they discuss the necessity of real-time data in today's fast-paced business environment, tackle the challenges organisations face in adapting to this shift, and highlight how data serves as the foundation for AI-driven innovation. Don't miss this insightful discussion packed with practical strategies and forward-looking ideas!

Key Takeaways
- Real-time analytics has evolved from a luxury to a necessity.
- Streaming technologies like Kafka and Spark have revolutionised data processing.
- Legacy systems are often monolithic and ill-suited for real-time analytics.
- Modern data platforms enable easier data management and integration.
- Continuous intelligence requires a solid analytics foundation.
- User experience is critical for the adoption of data applications.
- Organisations must treat data as a valuable asset.
- Data governance and quality are essential for effective analytics.
- The separation of compute from storage enhances scalability.
- Real-time processing with low latency improves user satisfaction.

Chapters
00:00 - Introduction to Real-Time Analytics
06:14 - The Evolution of Technology in Data Processing
10:09 - Challenges of Legacy Systems
14:23 - Innovative Approaches to Data Management
18:06 - Building a Foundation for AI Innovations
21:27 - User Experience in Data Applications
The convergence of Master Data Management (MDM) and Artificial Intelligence (AI) is transforming how businesses harness data to drive innovation and efficiency. MDM provides the foundation by organising, standardising, and maintaining critical business data, ensuring consistency and accuracy across an organisation. When paired with AI, this clean and structured data becomes a powerful asset, enabling advanced analytics, predictive insights, and intelligent automation. MDM and AI help businesses uncover hidden patterns, streamline operations, and make more informed decisions in real time.

By integrating MDM with AI, organisations can move beyond simply managing data to actively leveraging it for competitive advantage. AI algorithms thrive on high-quality, well-structured data, and MDM ensures just that, minimising errors and redundancies that could compromise results. This synergy empowers companies to personalise customer experiences, optimise supply chains, and respond proactively to market changes.

In this episode, Kevin Petrie, VP of Research at BARC US, speaks to Jesper Grode, Director of Product Innovation at Stibo Systems, about the intersection between AI and MDM.

Key Takeaways
- AI and master data management should be integrated for better outcomes.
- Master data improves the quality of inputs for AI models.
- Accurate data is crucial for training machine learning models.
- Generative AI can enhance product launch processes.
- Prompt engineering is essential for generating accurate AI responses.
- AI can optimise MDM processes and reduce operational costs.
- Fast prototyping is vital for successful AI implementation.

Chapters
00:00 - Introduction to AI and Master Data Management
02:59 - The Synergy Between AI and Master Data
05:49 - Generative AI and Master Data Management
09:12 - Leveraging Master Data for Small Language Models
11:58 - AI's Role in Optimizing Master Data Management
14:53 - Best Practices for Implementing AI in MDM
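The MDM idea described above — standardising records into one consistent "golden" form before AI models consume them — can be sketched in a few lines. This is an illustrative toy, not Stibo Systems' implementation; the normalisation rules, field names, and sample records are all hypothetical.

```python
# Hypothetical sketch of MDM-style standardisation and survivorship:
# normalise raw customer records, then keep one golden record per key,
# so downstream analytics and AI models see consistent inputs.

def standardise(record):
    """Normalise a raw customer record into a canonical form."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
        "country": record.get("country", "").strip().upper() or "UNKNOWN",
    }

def master_records(raw_records):
    """Keep one golden record per email address (a simple survivorship rule)."""
    golden = {}
    for rec in map(standardise, raw_records):
        golden.setdefault(rec["email"], rec)  # first standardised record wins
    return list(golden.values())

# Two messy variants of the same customer collapse into one clean record.
raw = [
    {"name": "  ada lovelace ", "email": "Ada@Example.com", "country": "gb"},
    {"name": "ADA LOVELACE", "email": "ada@example.com "},
]
print(master_records(raw))
```

Real MDM platforms add far richer matching (fuzzy names, addresses, hierarchies), but the principle is the same: the model trains on the golden records, not the raw duplicates.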
As cloud adoption grows, so do the challenges of managing costs effectively. Cloud environments offer scalability and flexibility but often come with hidden fees, unpredictable expenses, and resource sprawl that can quickly inflate budgets. Without the right tools and strategies, businesses may struggle to track spending, identify waste, and maintain budget alignment. Usage-based reporting is pivotal in this process, providing the granular visibility needed to understand real-time consumption patterns and optimise costs. Businesses can align expenses directly with value-driven activities by tracking how, where, and when resources are used. From preventing overspending to fostering accountability, usage-based reporting empowers teams to proactively manage their cloud expenses, turning cloud cost management into a strategic advantage rather than a recurring headache.

In this episode, George Firican, Founder of LightsOnData, speaks to Rem Baumann, Resident FinOps Expert at Vantage, about usage-based reporting and its benefits.

Key Takeaways
- Organisations face challenges in tracking complex cloud costs.
- Usage-based reporting provides context to cloud spending.
- Metrics should align with business goals for effective decision-making.
- Communication between finance and engineering teams is crucial.
- Identifying cost optimisation opportunities can lead to significant savings.
- Different industries require customised cost metrics.
- Cloud providers offer basic tools, but deeper insights are needed.
- Regular monitoring of metrics ensures financial transparency.

Chapters
00:00 - Introduction to Cloud Cost Management
03:03 - Understanding Cloud Complexity and Cost Tracking
05:53 - The Role of Usage-Based Reporting
09:06 - Metrics for Cost Optimization
12:02 - Industry-Specific Applications of Cost Metrics
14:49 - Aligning Cloud Costs with Business Goals
18:09 - Conclusion and Key Takeaways
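The core of usage-based reporting — rolling raw consumption records up into cost per team and service so spend maps to activity — can be sketched minimally. This is a hypothetical illustration, not Vantage's product; the record fields and hourly rates are invented for the example.

```python
# Hypothetical sketch of usage-based cost reporting: aggregate raw usage
# records into cost per (team, service) so finance and engineering can
# see which activities drive spend.
from collections import defaultdict

# (team, service, hours_used, rate_per_hour) — invented sample records
usage_records = [
    ("data-eng",  "compute",   120.0, 0.45),
    ("data-eng",  "storage",   500.0, 0.02),
    ("marketing", "compute",    30.0, 0.45),
    ("marketing", "analytics",  10.0, 1.20),
]

def usage_report(records):
    """Roll raw usage up into estimated cost per (team, service)."""
    report = defaultdict(float)
    for team, service, hours, rate in records:
        report[(team, service)] += hours * rate
    return dict(report)

report = usage_report(usage_records)
for (team, service), cost in sorted(report.items()):
    print(f"{team:10s} {service:10s} ${cost:8.2f}")
```

A real pipeline would pull these records from provider billing exports and tag them against business dimensions, but the aggregation step looks much like this.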
Data custodianship today involves managing and protecting vast quantities of sensitive information, requiring organisations to ensure security, regulatory compliance, and ethical usage. It's not just about protecting data from breaches but also about responsible storage, access, and deletion that aligns with strict industry standards and evolving privacy regulations.

The ethical dimensions of data custodianship add further complexity as organisations balance the need for data-driven insights with privacy rights and transparent usage. Mismanagement can lead to significant financial, legal, and reputational risks, making effective custodianship essential for maintaining customer trust and regulatory compliance.

In this episode, Paulina Rios Maya, Head of Industry Relations, speaks to Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, about compliance with global regulations, the role of AI in data management, and the necessity of human oversight in technology.

Key Takeaways
- Data custodianship emphasises that data belongs to individuals, not companies.
- Organisations must have a comprehensive plan for data management throughout its lifecycle.
- Transparency and communication with consumers are essential in data handling.
- Different types of data require different levels of protection based on risk.
- Building trust with consumers requires responsible data practices.
- Organisations need to prioritise basic data protection strategies over compliance with every regulation.

Chapters
00:00 - Introduction to Data Custodianship
03:03 - Understanding Responsibilities in Data Handling
05:59 - Balancing Innovation and Data Protection
08:45 - Building Trust Through Responsible Data Practices
12:07 - Navigating Compliance and Data Governance
14:54 - Leveraging AI for Enhanced Data Custodianship
18:06 - The Role of Humans in Technology and Data Management
Generative AI and unstructured data are transforming how businesses improve customer experiences and streamline internal processes. As technology evolves, companies find new ways to gain insights, automate tasks, and personalise interactions, unlocking new growth opportunities. The integration of these technologies is reshaping operations, driving efficiency, and enhancing decision-making, helping businesses stay competitive and agile in a rapidly changing landscape. Organisations that embrace these innovations can better adapt to customer needs and market demands, positioning themselves for long-term success.

In this episode, Doug Laney speaks to Katrina M. Conn, Senior Practice Director of Data Science at Teradata, and Sri Raghavan, Principal of Data Science and Analytics at AWS, about sustainability efforts and the ethical considerations surrounding AI.

Key Takeaways
- Generative AI is being integrated into various business solutions.
- Unstructured data is crucial for enhancing customer experiences.
- Real-time analytics can improve customer complaint resolution.
- Sustainability is a key focus in AI resource management.
- Explainability in AI models is essential for ethical decision-making.
- The combination of structured and unstructured data enhances insights.
- AI innovations are making analytics more accessible to users.
- Trusted AI frameworks are vital for security and governance.

Chapters
00:00 - Introduction to the Partnership and Generative AI
02:50 - Technological Integration and Market Expansion
06:08 - Leveraging Unstructured Data for Insights
08:55 - Innovations in Customer Experience and Internal Processes
11:48 - Sustainability and Resource Optimization in AI
15:08 - Ensuring Ethical AI and Explainability
23:57 - Conclusion and Future Directions
In this episode, Rachel Thornton, Fivetran's CMO, discusses the highlights of Big Data London 2024, including the launch of Fivetran Hybrid Deployment, which addresses the needs of organisations with mixed IT environments. The conversation delves into integrating AI into business operations, emphasising the importance of a robust data foundation. Additionally, data security and compliance challenges in the context of GDPR and other regulations are explored. The episode concludes with insights on the benefits of hybrid deployment for organisations.

Key Takeaways
- Big Data London 2024 is a significant event for data leaders.
- Fivetran Hybrid Deployment caters to organisations with mixed IT environments.
- AI integration requires a strong data foundation.
- Data security and compliance are critical in today's landscape.
- Organisations must understand their data sources for effective AI use.
- Hybrid deployment allows for secure data management.
- Compliance regulations are becoming increasingly stringent.
- Data readiness is essential for AI integration.

Chapters
00:00 - Introduction to Big Data London 2024
02:46 - Launch of Fivetran Hybrid Deployment
06:06 - Integrating AI into Business Operations
08:54 - Data Security and Compliance Challenges
11:50 - Benefits of Hybrid Deployment
Managing network traffic efficiently is essential to control cloud costs. Network flow reports are critical in providing detailed insights into data movement across cloud environments. These reports help organisations identify usage patterns, track bandwidth consumption, and uncover inefficiencies that may lead to higher expenses. With a clear understanding of how data flows, businesses can make informed decisions to optimise traffic, reduce unnecessary data transfers, and allocate resources more effectively. This helps lower cloud costs, improves network performance, and enhances security by revealing unusual or potentially harmful traffic patterns.

In this episode, Wayne Eckerson from Eckerson Group speaks to Ben Schaechter, CEO of Vantage, about optimising network traffic costs with Vantage's Network Flow Reports.

Key Takeaways
- Network Flow Reports provide detailed insights into AWS costs.
- They help identify specific resources driving network traffic costs.
- Organisations can reduce costs by up to 90% with proper configuration.
- The shift towards cost management in cloud services is critical.
- FinOps teams are becoming essential for cloud cost optimization.
- Anomaly detection can alert teams to unexpected cost spikes.
- Vantage integrates with multiple cloud providers for comprehensive cost management.
- Effective cost management does not have to impact production workflows.

Chapters
00:00 - Introduction to Vantage and Network Flow Reports
02:52 - Understanding Network Flow Reports and Their Impact
06:09 - Real-World Applications and Case Studies
09:03 - The Shift in Cost Management Focus
11:54 - Tangible Benefits of Implementing Network Flow Reports
15:07 - The Role of FinOps in Cost Optimization
18:00 - Conclusion and Future Insights
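The analysis a network flow report enables — attributing transfer volume to specific resources and ranking them by estimated cost — can be sketched simply. This is a hypothetical illustration, not Vantage's or AWS's actual API; the record fields, resource IDs, and per-GB rates are invented for the example.

```python
# Hypothetical sketch of flow-report cost analysis: attribute data
# transfer to resources and rank the resources driving the most cost.

# (resource_id, traffic_type, gigabytes) — invented flow records
flows = [
    ("i-0a1",  "cross-az",        800.0),
    ("i-0a1",  "internet-egress",  50.0),
    ("nat-01", "internet-egress", 1200.0),
    ("i-0b2",  "cross-az",         40.0),
]

# Invented per-GB rates, standing in for a provider's transfer pricing
RATES_PER_GB = {"cross-az": 0.01, "internet-egress": 0.09}

def top_cost_drivers(flow_records, n=3):
    """Return the n resources with the highest estimated transfer cost."""
    costs = {}
    for resource, traffic_type, gb in flow_records:
        costs[resource] = costs.get(resource, 0.0) + gb * RATES_PER_GB[traffic_type]
    return sorted(costs.items(), key=lambda kv: kv[1], reverse=True)[:n]

for resource, cost in top_cost_drivers(flows):
    print(f"{resource:8s} ${cost:8.2f}")
```

In this toy data the NAT gateway dominates, which mirrors the episode's point: the report's value is pinpointing *which* resource to reconfigure.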
Safe Software's FME is transforming Omaha's approach to urban mobility with groundbreaking solutions for asset management, e-scooter tracking, and parking management. FME's robust data integration capabilities are at the core of Omaha's advancements. The data integration platform enables real-time tracking of e-scooters, offering precise data on their locations and usage patterns. This innovation enhances the management and accessibility of e-scooters, making urban mobility more efficient and user-friendly.

Automated parking management processes, facilitated by FME, streamline city operations and reduce manual efforts. This automation leads to smoother parking experiences for residents and visitors, while dynamic rate adjustments, powered by FME, ensure that parking fees are responsive to real-time demand, optimising availability and revenue.

In this episode, Wayne Eckerson from Eckerson Group speaks to Jacob Larson, an Applications Analyst from the City of Omaha, to discuss Omaha's usage of FME.

Key Takeaways
- FME helps automate the tracking of e-scooters in real time.
- Data sharing agreements with providers like Lime enhance tracking capabilities.
- Omaha's parking management has been transformed through automation.
- FME allows for dynamic changes in parking rates based on events.
- The integration of GIS data with third-party APIs is crucial for parking management.
- Omaha is pioneering a real-time parking information system in the US.

Chapters
00:00 - Introduction to Omaha's Data Initiatives
01:03 - FME's Role in Asset Management
04:53 - Real-Time Tracking of E-Scooters
07:48 - Automating Parking Management
10:06 - Innovations in Parking Availability
12:59 - Dynamic Parking Rate Management
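Demand-responsive pricing of the kind described above can be illustrated with a tiny rule-based sketch. This is not Omaha's actual rate logic or an FME workflow; the thresholds, multipliers, and base rate are invented for the example.

```python
# Hypothetical sketch of demand-responsive parking rates: scale the
# hourly rate with real-time occupancy so pricing tracks demand.

def dynamic_rate(base_rate: float, occupancy: float) -> float:
    """Return an hourly rate scaled by occupancy (0.0 = empty, 1.0 = full)."""
    if occupancy >= 0.85:       # near capacity: raise the rate
        return round(base_rate * 1.5, 2)
    if occupancy <= 0.40:       # plenty of space: discount to attract parkers
        return round(base_rate * 0.75, 2)
    return base_rate            # normal demand: keep the base rate

print(dynamic_rate(2.00, 0.90))  # rate during peak demand
print(dynamic_rate(2.00, 0.30))  # rate when the garage is mostly empty
```

A production system would feed live occupancy from sensors or payment data (the kind of integration the episode attributes to FME) into a rule like this.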
Open source technologies are transforming how businesses manage real-time data on cloud platforms. By leveraging flexible, scalable, and cost-effective open-source tools, organisations can process and analyse large volumes of data with speed and precision. These technologies offer unmatched transparency, customisation, and community-driven innovation, making them ideal for real-time monitoring, analytics, and IoT applications.

As data demands grow, open-source solutions ensure that businesses stay agile, reduce vendor lock-in, and maintain full control over their cloud infrastructure. The result? Faster insights, smarter decision-making, and enhanced performance, all powered by open source.

In this episode, Paulina Rios Maya, Head of Industry Relations at EM360Tech, speaks to Mikhail Epikhin, Chief Technology Officer at Double Cloud, about the power of open source in cloud platforms.

Key Takeaways:
Open-source technologies provide standard building blocks for products.
Community-driven innovation is essential for the evolution of technology.
Flexibility in data infrastructure is crucial for real-time processing.
Observability and monitoring are vital for performance optimisation.
Managed services can accelerate product development and feature implementation.

Chapters:
00:00 - The Power of Open Source in Cloud Platforms
05:24 - Apache Airflow: Enhancing Real-Time Data Management
10:08 - Balancing Open Source and Managed Services
13:57 - Best Practices for Scalability and Performance
Big Data LDN 2024, the UK's leading data, analytics, and AI event, is less than a week away, promising two days filled with ground-breaking stories, expert insights, and endless innovation.

Taking place at the Kensington Olympia in London on September 18-19, this year's event features fifteen theatres and over 300 expert speakers sharing insights on some of the industry's hottest topics, from generative AI to data analytics and privacy.

With the event less than a week away, EM360Tech's Head of Podcast Production, Paulina Rios Maya, grabbed Big Data LDN's Event Director, Andy Steed, for a chat about his expectations for this year's event and its growing importance in the data world.

In the episode, they discuss:
The exciting themes or breakthroughs attendees can expect to see showcased this year
How Big Data London remains relevant in such a rapidly evolving field
The unique networking opportunities or interactive experiences attendees have at the conference
The standout sessions or keynote speakers at the conference

Chapters:
00:00: Introduction to Big Data LDN 2024
01:35: Showcasing Data Stories, Transformations, and Challenges
02:33: The Networking Opportunities with Industry Leaders and Peers at Big Data LDN 2024
05:01: Staying Relevant with a Focus on Generative AI and Real-World Use Cases
06:55: The Importance of Data Events for Community Building and Learning

About Big Data LDN 2024
Big Data London is the UK's largest data and analytics event, attracting over 16,500 visitors each year. Taking place at the Olympia in London on September 18-19, this year's event features fifteen theatres and over 300 expert speakers across the two-day conference.
Attendees can meet face-to-face with tech providers and consultants to find solutions to their data challenges, and view the latest product releases and software demos to enhance their business's data capabilities.

It's also a great opportunity for attendees to strengthen their business network with new and existing partners, immerse themselves in the data community, and network with speakers, colleagues, and practitioners, all in two days at Big Data LDN.
Sustainable sourcing is essential for businesses committed to environmental and social responsibility, but achieving it requires accurate and reliable data. Master Data Management (MDM) ensures that all sourcing data, such as supplier information, certifications, and compliance records, is consistent and up-to-date. This enables organisations to make informed decisions that align with their sustainability goals, reduce waste, and promote ethical practices throughout their supply chain.

MDM is the foundation of a successful sustainability strategy. By providing a single source of truth for all critical data, MDM helps businesses monitor and track their sustainability efforts effectively. With accurate data, companies can identify opportunities to improve resource efficiency, reduce carbon footprints, and ensure compliance with environmental standards, ultimately leading to a more sustainable and resilient business model.

In this episode, George Firican, Founder of LightsOnData, speaks to Matthew Cawsey, Director of Product Marketing and Solution Strategy, and Paarijat Bose, Customer Success Manager at Stibo Systems, to discuss sustainable sourcing and why accurate data matters.
Key Takeaways:
Sustainable sourcing involves understanding the provenance and environmental impact of products, ensuring compliance with regulations, and meeting sustainability goals.
Data completeness and accuracy are crucial in meeting regulatory requirements and avoiding issues like greenwashing.
Managing sustainability data requires a solid foundation of MDM to ensure data accuracy, stewardship, and semantic consistency.
MDM solutions help companies collect, manage, and share sustainability data, enabling them to meet compliance requirements and achieve their sustainability goals.

Chapters:
00:00 - Introduction and Overview
01:07 - The Challenge of Collecting Data for Compliance and Reporting
02:31 - Data Accuracy and Completeness in the Supply Chain
05:23 - Regulations and the Demand for Transparent and Complete Data
08:41 - The Role of Master Data Management in Sustainability
15:51 - How Data Management Technology Solutions Help Achieve Sustainability Goals
21:02 - The Need to Start Early and Engage with Data Management Solutions
22:01 - Conclusion and Call to Action
Data provenance is essential for maintaining trust and integrity in data management. It involves tracking the origin of data and understanding how it has been processed and handled over time. By focusing on fundamental principles such as identity, timestamps, and the content of the data, organisations can ensure that their data remains accurate, consistent, and reliable.

Implementing data provenance does not require significant changes or large investments. Existing technologies and techniques can be seamlessly integrated to provide greater transparency and control over data. With data provenance, businesses can confidently manage their data, enhancing decision-making and fostering stakeholder trust.

In this episode, Jon Geater, Co-Chair of the Supply Chain Integrity Transparency and Trust (SCITT) Working Group, speaks to Paulina Rios Maya, Head of Industry Relations, about data provenance.

Key Takeaways:
Data provenance is knowing where data comes from and how it has been handled, ensuring trust and integrity.
The fundamental principles of data provenance include identity, timestamps, and the content of the data.
Data provenance can be implemented by integrating existing technologies and techniques without significant changes or investments.
Data provenance helps with compliance, such as GDPR, by providing a transparent record of data handling and demonstrating compliance with requests.

Chapters:
00:00 - Introduction and Background
02:01 - Understanding Data Provenance
05:47 - Implementing Data Provenance
10:01 - Data Provenance and Compliance
13:50 - Success Stories and Industry Applications
18:10 - Conclusion and Call to Action
FME is a vital tool in disaster management and response. It enables the integration and transformation of geospatial data for real-time tracking of disasters and hazards. By ensuring accurate and timely data analysis, it provides essential decision support for disaster management professionals.

During the Maui wildfires, FME and the Pacific Disaster Center were crucial in managing and analysing critical data, allowing for effective coordination and response. By facilitating seamless data sharing and collaboration among stakeholders, FME helps ensure that the correct information reaches the right people at the right time.

In this episode of the EM360 Podcast, Alejandro Leal, an Analyst at KuppingerCole, speaks to Jorma Rodieck, a GIS Specialist at the Pacific Disaster Center, about the importance of FME.

Key Takeaways:
FME is an essential tool in disaster management and response, allowing for the integration and transformation of geospatial data.
FME enables real-time data analysis and decision support for disaster management professionals.
During the Maui wildfires, FME was instrumental in managing and analyzing critical data, providing a common operating picture for response efforts.
FME ensures effective data sharing and collaboration among various stakeholders, enabling smooth interoperability between departments and agencies.

Chapters:
00:00 - Introduction and Background
02:35 - The Role of FME in Disaster Management
06:44 - Managing and Analyzing Critical Data with FME
10:34 - FME's Impact during the Maui Wildfires
11:59 - Ensuring Effective Data Sharing and Collaboration
15:20 - The Future of FME in the Pacific Disaster Center
18:15 - Conclusion
Open source real-time analytics offers unparalleled advantages, providing businesses with the freedom and independence to maintain operations seamlessly, even if a vendor issue arises. However, the journey isn't without its challenges. Open source solutions can often be clunky and require specialised expertise to manage effectively. This is where DoubleCloud comes in, offering a managed platform that addresses these obstacles by handling crucial responsibilities such as backups, high availability, and security updates, allowing businesses to focus on leveraging their data.

In this podcast, Christina Stathopoulos speaks to Vladimir Borodin, Co-Founder and CEO of DoubleCloud, about open source strategies and the advantages of the DoubleCloud solution.

Key Takeaways:
DoubleCloud's managed platform helps overcome the challenges of open source, such as clunkiness and a lack of expertise.
Successful customer use cases demonstrate the performance and cost benefits of DoubleCloud's solution.
The transition phase to DoubleCloud's solution depends on the complexity of the application.
Using open source whenever possible is recommended.

Chapters:
00:00 - Introduction and Background
02:29 - The Advantages of Open Source
04:21 - Challenges of Open Source
06:47 - The Power of Real-Time Analytics
09:11 - Success Stories: Improved Performance and Reduced Costs
12:54 - Navigating the Transition to DoubleCloud's Solution
15:14 - The Importance of Using Open Source
Privacy by Default and Design is a fundamental principle of the General Data Protection Regulation (GDPR). It prioritises transparency, user control, and data security from the outset. This approach ensures that privacy is integrated into systems and processes by default rather than as an afterthought. By embedding these practices, organisations enhance trust and accountability while meeting regulatory requirements. However, challenges such as resistance to change and the need for cultural transformation must be addressed to implement this principle effectively.

In this episode of the Don't Panic, It's Just Data podcast, Tudor Galos, Senior Privacy Consultant, speaks to Paulina Rios Maya, Head of Industry Relations, about how the impact of privacy by default and design extends to user experience, where issues like consent fatigue and the necessity for user-friendly interfaces arise.

Key Takeaways:
Organisations face challenges in implementing privacy by default and design, including resistance to change and the need for cultural transformation.
Privacy by default and design impact user experience, with issues like consent fatigue and the need for user-friendly interfaces.
Regulations like GDPR and CCPA incorporate privacy by default and design principles, emphasising compliance and accountability.

Chapters:
00:00 - Introduction and Overview
01:00 - Core Principles of Privacy by Default and Design
02:19 - Difference from Traditional Privacy Practices
04:09 - Challenges in Implementing Privacy by Default and Design
05:33 - Impact of Privacy by Default on User Experience
08:14 - Alignment of Privacy by Default with Regulations
09:04 - Ensuring Compliance and Trust
11:24 - Implications of Emerging Technologies on Privacy
13:15 - Innovations in Privacy-Enhancing Technologies
15:50 - Conclusion
Safe Software's Feature Manipulation Engine (FME) plays a pivotal role in the City of Fremont's operations, particularly in ensuring accurate and efficient data submissions under the Racial and Identity Profiling Act (RIPA). By automating complex workflows and enhancing data quality, FME not only ensures seamless compliance with RIPA requirements but also optimises processes for the city's ITS and GIS divisions.

FME also drives innovation in projects like the DroneSense programme and the city's Cityworks asset management integration. With seamless data integration and powerful visualisations, FME empowers the City of Fremont to enhance operations, improve asset management, and support informed decision-making.

In this episode, Jonathan Reichental, founder at Human Future, speaks to John Leon, GIS Manager for the City of Fremont, to discuss:
FME
RIPA
Public Safety

Chapters:
00:00 - Introduction and Overview of the City of Fremont and IT/GIS Division
03:01 - Explanation of the Racial and Identity Profiling Act (RIPA)
04:27 - Challenges in Meeting RIPA Standards and Utilizing FME
06:21 - How FME Ensures Error-Free RIPA Data Submissions
09:40 - Benefits of Using FME for RIPA Compliance
10:39 - Other Innovative Projects Utilizing FME in the City of Fremont
13:30 - Future Plans for FME in the City of Fremont
17:17 - Recommendations for Government Agencies: Leverage FME for Data Submissions
Real-time data insights help identify performance bottlenecks, manage data efficiently, and drive innovation. Despite the growing need for these capabilities, organisations often face challenges in implementing effective real-time analytics. Achieving high-concurrency data processing is crucial for overcoming performance bottlenecks in real-time analytics. Embracing real-time analytics is not just a necessity, but a way to transform your data into actionable insights, optimise performance, and fuel business growth.

Yellowbrick is a modern data platform built on Kubernetes for enterprise data warehousing, ad-hoc and streaming analytics, and AI and BI workloads that ensures comprehensive data security, unparalleled flexibility, and high performance.

In this podcast, Doug Laney, a Data Strategy Innovation Fellow with West Monroe, speaks to Mark Cusack, the CTO of Yellowbrick, about the power of real-time analytics.

Key Takeaways:
Real-time analytics enables faster business decisions based on up-to-date data and focuses on enabling actions.
Using a SQL data platform like Yellowbrick, designed for high-concurrency data processing, can address performance bottlenecks in real-time analytics.

Chapters:
00:00 - Introduction and Overview
01:07 - The Benefits of Real-Time Analytics
06:23 - Overcoming Challenges in Implementing Real-Time Analytics
06:51 - High Concurrency Data Processing for Real-Time Analytics
13:59 - Yellowbrick: A Secure and Efficient SQL Data Platform
Accurate and reliable data is essential for training effective AI models. High-quality data ensures precision, reduces bias, and builds trust in AI systems. Similarly, Master Data Management (MDM) systems enhance data quality by integrating data from multiple sources, enforcing data governance, and providing a single source of truth. This helps eliminate discrepancies and maintain data integrity.

Integrating Product Information Management (PIM) with MDM ensures accurate and consistent product data across all channels, crucial for data-driven marketing. This combination centralises customer and product data, enabling precise targeting and personalised experiences. MDM and PIM integration leads to higher ROI and improved customer satisfaction by supporting effective marketing strategies.

In this episode of the EM360 Podcast, Paulina Rios Maya speaks to Philipp Krueger about integrating PIM and MDM functionalities and how it streamlines operations, improves data accuracy, and supports data-driven marketing strategies.

Chapters:
00:00 - Introduction and Importance of Data Quality in AI Models
05:27 - Core Capabilities of an MDM System
08:13 - The Role of Data Governance in Data Management
13:37 - Enhancing Customer Experience and Driving Sales with Pimcore
19:47 - Integration of PIM and MDM Functionalities for Data-Driven Marketing Strategies
22:59 - The Impact of Accurate Data on Revenue Growth
27:28 - Simplifying Data Management with a Single Platform
One of the biggest challenges businesses face when it comes to data visualisation is handling the volume of data and the need for faster processing methods. There's a common misconception that effective data visualisation must be fancy and interactive, but simple visuals can be just as powerful. Ann K. Emery, an expert in the field, believes that accessibility doesn't have to be time-consuming or expensive.

In this podcast, she shares actionable strategies for creating accessible visualisations with Paulina Rios Maya, Head of Industry Relations at EM360Tech.

Key Takeaways:
Avoiding red-green colour combinations
Ensuring proper colour contrast
Using direct labelling instead of legends
Avoiding using all-caps
Using grey to highlight important information
Employing small multiples to simplify complex visualisations

Chapters:
00:00 - Introduction
00:54 - Defining Accessibility in Data Visualization
02:17 - Big A Accessibility Tips
06:36 - Little a Accessibility Strategies
12:28 - The Future of Data Accessibility
Managing cloud costs effectively has become a significant challenge for organisations relying on public cloud services. FinOps addresses these challenges by ensuring efficient spending and governance of cloud resources. Key practices in FinOps include achieving complete visibility into cloud usage and costs, fostering cross-functional collaboration between finance, operations, and engineering teams, and utilising data-driven decision-making to optimise cloud investments.

By embracing a centralised team, organisations can instil a culture of governance and efficiency in cloud cost management. This approach can lead to enhanced resource utilisation and substantial cost savings. With Vantage, your organisation can cultivate a robust cloud cost governance and efficiency culture, ensuring your cloud investments yield maximum value.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at BARC US, speaks to Ben Schaechter, CEO and co-founder of Vantage, to discuss:
FinOps
Vantage's platform
Cloud costs and FinOps practices

Chapters:
00:00 - Introduction and Overview
02:02 - Understanding FinOps and Cloud Cost Governance
07:45 - Best Practices in FinOps: Centralization and Collaboration
13:50 - The Role of Data-Driven Insights in Optimizing Cloud Costs
Managing large volumes of data in the context of AI and machine learning applications presents challenges related to data quality, data preparation, and automation. The requirements of data management are changing with the advent of generative AI, requiring more flexibility and the ability to handle larger volumes of data.

Pimcore leverages AI and machine learning to automate data utilization and improve data intelligence. By streamlining data management and integrating various data sources, Pimcore drives revenue growth for its customers. The platform combines data management and experience management to deliver personalized data across communication channels. Pimcore's MDM solution addresses the challenges of integrating data for both human and machine consumption. The choice between physical and virtual MDM hubs depends on the use case and industry.

In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Dietmar Rietsch, Managing Director and Co-Founder of Pimcore, to discuss:
Data management
AI
Machine learning
Data quality
Maximising data relationships through text analytics, particularly with tools like LLMs and Knowledge Graphs, offers organisations unprecedented insights and capabilities. By leveraging these advanced technologies, businesses can unlock hidden connections and patterns within their data, leading to more informed decision-making and strategic planning.

Integrating Ontotext's solutions is a game-changer, empowering organisations to extract, organise, and visualise complex information from unstructured data sources. With Ontotext's expertise in semantic technology, businesses can construct robust knowledge graphs that offer a comprehensive understanding of their data landscape. This comprehensive approach not only facilitates better analysis and interpretation of data but also ignites innovation and propels business growth in today's increasingly data-driven world.

In this episode of the EM360 Podcast, Paulina Rios Maya, Head of Industry Relations, speaks to Doug Kimball, Chief Marketing Officer at Ontotext, to discuss:
AI in Enterprise Knowledge
LLMs
Knowledge Graphs

Chapters:
00:00 - Challenges of Integrating LLMs into Enterprise Knowledge Management Systems
04:35 - Enhancing Compatibility and Efficacy with Knowledge Graphs
07:21 - Innovative Strategies for Integrating LLMs into Knowledge Management Frameworks
11:07 - The Future of LLM-Driven Knowledge Management Systems: Intelligent Question Answering and Insight Enablement
Managing cloud computing costs is a pressing challenge faced by organisations of all sizes across industries. As businesses increasingly migrate their operations to the cloud, the complexity of managing and optimizing costs grows exponentially. Without proper oversight and strategy, cloud expenses can quickly spiral out of control, leading to budget overruns and financial inefficiencies.

Vantage addresses this issue head-on by providing organizations with a powerful platform equipped with automated cost recommendations, customizable reports, and real-time monitoring capabilities. By leveraging advanced analytics and machine learning, Vantage empowers teams to gain unparalleled visibility into their cloud spending and make informed decisions to optimize costs.

In this episode of the EM360 Podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, speaks to Ben Schaechter, CEO and Co-founder of Vantage, to discuss:
Cloud cost management
FinOps
Cost optimization
Automated cost recommendations
Ensuring the reliability and effectiveness of AI systems remains a significant challenge. In most use cases, generative AI must be combined with access to your company data, a process called retrieval-augmented generation (RAG). The results from generative AI are vastly improved when the model is enhanced with contextual data from your organization. Most practitioners rely on vector embeddings to surface content based on semantic similarity. While this can be a great step forward, achieving good quality requires combining multiple vectors with text and structured data, using machine learning to make the final decisions. Vespa.ai, a leading player in the field, enables solutions that do this while keeping latencies suitable for end users, at any scale.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at BARC US, speaks to Jon Bratseth, CEO of Vespa.ai, to discuss:
The opportunity for generative AI in business
Why you need more than vectors to achieve high quality in real systems
How to create high-quality generative AI solutions at an enterprise scale
Geographic Information Systems (GIS) have transformed urban landscape analysis and government policy creation, albeit not without challenges. In the past, GIS analysts often had to visit locations in person to piece together information.

With the help of cutting-edge platforms like Safe Software's FME, cities like Burnaby, British Columbia, have revolutionised their operations. This has led to a significant enhancement in the quality of life for residents. From predictive modelling to real-time data analysis, the potential for innovation appears boundless, underscoring the importance of GIS technology in improving urban operations.

In this episode of the EM360 Podcast, Wayne Eckerson speaks to Herman Louie, GIS Analyst at the City of Burnaby, to discuss:
Design and implementation of GIS solutions
Safe Software's FME platform
Transition to NG9-1-1
The future of GIS
Government organisations face a multitude of challenges when it comes to managing their data effectively. From interoperability issues between systems to the need for seamless collaboration across agencies, the complexity can be overwhelming. Safe Software's FME platform offers a comprehensive solution to these challenges by providing a flexible and intuitive data integration platform tailored to the unique needs of government agencies.

With FME, government organisations can overcome the barriers that hinder efficient data management. FME enables streamlined operations and improved decision-making processes by seamlessly connecting disparate systems and applications. Whether it's digital plan submissions, emergency services coordination, or interagency health data sharing, FME empowers government agencies to achieve their data integration goals with ease.

In this episode of the EM360 Podcast, Doug Laney, Data and Analytics Strategy Innovation Fellow at West Monroe, speaks to Tom Seymour, Government Sales Team Lead at Safe Software, to discuss:
Data integration and interoperability
Safe Software's FME platform
FME in governments
Advantages of FME
ROI with FME
Ever wonder how search engines understand the difference between "apple," the fruit, and the tech company? It's all thanks to knowledge graphs! These robust and scalable databases map real-world entities and link them together based on their relationships. Imagine a giant web of information where everything is connected and easy to find. Knowledge graphs are revolutionizing how computers understand and process information, making it richer and more relevant to our needs.

Ontotext is a leading provider of knowledge graph technology, offering a powerful platform to build, manage, and utilise knowledge graphs for your specific needs. Whether you're looking to enhance search capabilities, improve data analysis, or unlock new insights, Ontotext can help you leverage the power of connected information.

In this episode of the EM360 Podcast, George Firican, Founder of LightsOnData, speaks to Sumit Pal, Strategic Technology Director at Ontotext, to discuss:
Knowledge Graphs
Use Cases
Ontotext GraphDB
Integration of AI
Industry best practices
The traditional data warehousing landscape is changing. The concept of a private data cloud offers a compelling alternative to both cloud PaaS and traditional data warehousing. Imagine a secure, dedicated environment for your data, existing entirely within your organisation's control.

Yellowbrick, a leader in private data cloud solutions, empowers businesses to leverage their data on their terms. Its Bring Your Own Cloud (BYOC) approach offers unmatched flexibility and control. You can deploy Yellowbrick anywhere your data needs to be: public cloud, private cloud, or even the network edge. This ensures compliance with regulations, keeps your data exactly where you want it, and can bring down costs.

In this episode of the EM360 Podcast, Wayne Eckerson, President of Eckerson Group, speaks to Mark Cusack, Chief Technology Officer of Yellowbrick, to discuss:
The need for hybrid multi-cloud data platforms
How Yellowbrick differentiates
The future of private data cloud
Why Yellowbrick?
The data analysis landscape is on the precipice of a paradigm shift. Generative AI (GenAI) promises revolutionary insights, but traditional systems struggle to feed its insatiable appetite for well-structured data. Imagine GenAI as a high-powered engine: it needs meticulously organised fuel to reach its full potential.

In this episode of the EM360 Podcast, Analyst Christina Stathopoulos guides Deborah Leff (CRO at SQream) and Jason Hardy (CTO, AI at Hitachi Vantara) as they dissect:
The Bottlenecks of Traditional Systems
Unlocking GPU Potential
Fueling Insights with Structured Data
Faster Time to Insights

This episode goes beyond theory, exploring real-world examples (like a company querying a staggering 64 quadrillion rows). It showcases the potential for SQream and Hitachi Vantara to empower organisations to make data-driven decisions with unprecedented speed and accuracy.
Cloud-native is the new gold rush for businesses seeking speed, efficiency, and innovation. But are you getting the most out of your investment? Legacy troubleshooting and observability tools can be hidden anchors, dragging down your developers' productivity. The result? You're not reaping the full benefits of cloud-native, and your competitors are leaving you in the dust.

Chronosphere, a leader in modern observability, can empower your developers and unlock the true potential of cloud-native. Buckle up and get ready to discover how observability can become your secret weapon for unleashing developer agility and innovation.

In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at Eckerson Group, speaks to Ian Smith, Field CTO at Chronosphere, to discuss:
Observability
Cloud-native software developers
Developer inefficiency
The Chronosphere solution
Data pipeline best practices
Intelligent document processing (IDP) is a technology-driven approach that automates document processing and the extraction of valuable information. While handling structured data is relatively straightforward, processing and analyzing unstructured data is laborious. IDP equips users with the ability to process a multitude of document types, including PDFs, spreadsheets, and Word documents, among others.

IDP platforms offer a powerful solution that streamlines data extraction from these documents by eliminating the need for manual intervention. The extracted data, when integrated, enables you to make reliable decisions and improve business efficiency.

In this episode of the EM360 Podcast, Analyst Christina Stathopoulos speaks to Jay Mishra, COO at Astera, to discuss:
Traditional document processing and the problems it causes
AI-powered solutions
The future of document processing and data extraction

Find out more about Astera's unified, no-code data management platform here.
Rainfall. Temperature. Humidity. Natural disasters. Human methods of reading and predicting the weather can be traced all the way back to native tribes and ancient civilisations, and the practice is still prevalent in the modern world today. Whether (no pun intended) it's an agricultural organisation looking to leverage precipitation data for planting schedules, energy companies looking at temperature trends to predict energy consumption patterns or transportation outfits looking to avoid delays, mastering weather data can really help modern companies to protect themselves against the unknown.

In this episode of the EM360 Podcast, Analyst Susan Walsh speaks to Christian Schluchter, CTO at Meteomatics, to discuss:

- Key use cases of weather data
- Mastering data access and predictive analytics
- The importance of being data-driven
Maximise business value with data products: they incorporate essential data and related capabilities to meet key business objectives. Data products contain datasets, related metadata and a wide range of functionality to understand whether data is fit for use. But unlike relying on a raw dataset or data pipeline to generate value, data products deliver a comprehensive, packaged solution that enables data users to achieve their data-driven goals with easier accessibility.

In this EM360 Podcast episode, Head of Content Matt Harris sits down with Nathan Turajski, Senior Director of Product Marketing at Informatica. They explore how to:

- Drive better business outcomes with data products
- Understand the benefits of user-friendly and efficient consumption
- Improve AI, analytics and customer experience initiatives
Data access is a critical aspect of how businesses manage and leverage their data to drive decision-making. With large volumes of data from various sources, it's important that enterprises have a strategy in place to make data access faster and more efficient while remaining compliant, secure and accurate.

In this episode of the EM360 Podcast, Christina Stathopoulos, Founder at Dare to Data, speaks to Anthony Cosgrove MBE, Co-Founder at Harbr, to discuss:

- Data access challenges and pitfalls
- Facilitating business use cases with improved data access
- Advice for CDOs and data professionals
The data lakehouse has been quickly gaining popularity within the data management and analytics space. Combining elements of data lakes and data warehouses, the data lakehouse aims to address the challenges associated with both in a way that helps companies reach their data and business goals.

In this episode of the EM360 Podcast, Christina Stathopoulos speaks to Read Maloney, CMO at Dremio, as they discuss:

- The state of the data lakehouse
- How an effective lakehouse strategy can be the key to digital transformation
- How data lakehouses empower business
Telemetry data pipelines are designed to collect, process and transmit telemetry data from various sources to a place where businesses can store, analyse and utilise it for better decision-making. For SREs, DevOps, ITOps, SecOps, DataOps and the wider tech profession, taking control of your telemetry data to address data challenges in a compliant and secure way should be a common goal for the modern business.

In this episode of the EM360 Podcast, Analyst Kevin Petrie speaks to Tucker Callaway, CEO at Mezmo, to discuss:

- Understanding telemetry data pipelines
- Addressing data challenges with telemetry data
- Optimising performance and enhancing security with telemetry data
In an era dominated by the relentless surge of information, data has become the lifeblood of modern businesses and organisations. As the volume, variety, and velocity of data continue to escalate at an unprecedented pace, the role of data leaders has become increasingly pivotal. These architects of the digital realm are entrusted with the formidable task of not only harnessing the power of data but also steering their organisations through the complex and ever-evolving data landscape.

In this episode of the EM360 Podcast, Head of Content Matt Harris speaks to Niamh O'Brien, Senior Manager of Solution Architecture at Fivetran, to discuss:

- Challenges faced by data leaders today
- Becoming GenAI ready
- The big opportunity for CDOs and their teams
Having a 360-degree view of your business means having a comprehensive and holistic understanding of every aspect of your organisation. It involves collecting and integrating data from various sources and departments to create a unified, real-time view. This panoramic perspective enables businesses to make informed decisions, identify opportunities, and address challenges across the entire enterprise.

In this episode of the EM360 Podcast, Head of Content Matt Harris speaks to Prash Chandramohan, Senior Director of Product Marketing at Informatica. They discuss:

- Having a 360-degree view of your business
- Driving great customer experiences
- Is your data ready for AI?
Web data extraction: the ability to transform any website into structured data.

Using a residential proxy network, companies have been able to fine-tune their data collection for a range of use cases, from threat intelligence and email protection to ad verification and market research. But why do companies need this now? How do the different stacks of data extraction work, and how can you work out which one is best for you? And how is AI being used in this technology to reach industry-leading scalability and reliability?

In today's episode of the EM360 Podcast, Head of Content Matt Harris speaks to Eitan Bremler, VP of Products at NetNut, to discuss:

- Data extraction and why it's a problem for companies
- The different stacks of data extraction
- How AI/ML should fit into your data strategy
Product data that is clean, consistent and enriched with accurate information and descriptions gets results. When your customers can easily find the products they're looking for, you can expect sales to increase, product returns to all but disappear, and brand loyalty to continue to grow. The problem is, how do you consistently deliver this level of quality in a world of disparate data sources, data silos, and the burden of existing technical debt?

AI is a powerful tool that can strategically help this industry unlock the true value of its data and meet the increasing demand for better data. By automating data cleaning, enhancing data quality, and generating actionable insights, organisations can deliver more meaningful experiences to their customers while eliminating the clutter of irrelevant data. This technology accelerates the industry's journey towards data-driven innovation and customer-centricity, ultimately building trust with the end consumer.

In today's episode of the EM360 Podcast, Analyst Susan Walsh speaks to Sam Russo, Practice Director of Automotive & Heavy Duty at Pivotree, as they discuss:

- Why “perfect” product data is so important
- Issues in the automotive industry
- What AI/ML means for those modelling and enriching product data