This is DAMA Norway's podcast, created to provide an arena for sharing experiences within Data Management, showcase the competence and level of knowledge in this field in the Nordics, get in touch with professionals, spread the word about Data Management, and not least promote the Data Management profession.
Winfried Adalbert Etzel - DAMA Norway
«I consider this Data Governance as a cure. (…) Data Governance can make things better.»

In this clarifying conversation, Finnish data expert Säde Haveri shares her 18 years of experience and introduces a practical framework consisting of five key elements that can guide any organization's data governance journey.

Säde, who is a Data Governance Manager at Relex Solutions and co-founder of Helsinki Data Week, first explains the important difference between a framework and a playbook. While many consultants offer ready-made solutions, Säde argues that a truly effective framework functions more like scaffolding, helping organizations uncover their own best path forward.

We dive deep into the five elements: the choice between a top-down or bottom-up approach, the balance between defensive and offensive strategies, how to define the right scope, identifying key stakeholders, and the strategic role of external consultants. Säde illustrates how these decisions affect the structure, implementation, and success of data governance, with practical examples from her own experience.

Here are our hosts' key takeaways:
- Data Governance is at the heart of the socio-technical system - it requires a variety of skills.
- The experience for the end user has not changed much in the last almost 20 years.
- There is a need for «group support» for data people.

What is a framework?
- There are ambiguous connotations of the word «framework».
- A framework is not a playbook. A framework describes the what, not the how. You need to adjust it to your reality.
- Think of a framework as non-prescriptive.
- Frameworks are related to best practices, but they are not the same thing.
- Use a framework to identify your strengths and build your data governance practices around those.

Top-down or bottom-up
- The need for data governance can be initiated by a management awakening (e.g. GDPR), or by a need for better data at the practitioner level.
- Top-down often materializes in conceptual approaches: you start at a conceptual level and create data governance roles around these conceptual entities.
- From a bottom-up perspective, you are building governance around your tables or datasets.
- As a middle way, you can focus on data products as objects to build governance around.

Aligning strategy defensively or offensively
- Defensive vs. offensive strategy - based on an article by Davenport from 2017.
- There is no one-size-fits-all.
- You need to understand your motivation: is it driven by risk mitigation needs or by business value creation?
- You need to understand your sector-driven differences.
- Look at this as a spectrum, where your approach can shift between offensive and defensive based on the criticality of the data for the use case you are working with.
- You always have to show your value, and the value needs to be measurable.
- AI-ready data can be both offensive and defensive.

Identifying scope and key stakeholders
- Identify your stakeholders and scope based on the strategic alignment between offensive and defensive.
- Use business stakeholders rather than IT, to gain a better understanding of the underlying problem.
- Data Governance is a business value enabler rather than a cost-saving activity.

Determining the role of external consultants
- You have to sell your solution. Selling is something you cannot outsource.
- If you are looking at tooling, consider whether you can find consultants with the right knowledge and capabilities.
- Try to understand what experiences the consultants can bring to the table.
- Ensure that you are aligned on a methodological basis.
«Sometimes it feels like you have CIOs going on Julia Roberts-in-Pretty-Woman spending sprees.»

Have you ever wondered why data governance often becomes so complicated that no one really understands what it's about?

In this episode, we take a refreshing deep dive with data expert Rasmus Bang, who shows how to make data governance simple and relevant.

We explore the delicate art of engaging middle management, which is often the key to successful implementation of data governance. Through data governance committees and a focus on concrete business challenges, you can create both transparency and accountability that drive real change. Rasmus also explains how to bridge the gap between process excellence and data governance - how these disciplines can reinforce each other rather than compete.

Here are our hosts' key takeaways:
- The level of regulation in your industry determines your approach to data governance.
- Spend time exploring what is necessary and relevant from a business point of view.
- Identify the pain points you have today, to tackle them beyond compliance. This is where you show your value to the business.
- Storytelling is essential in data governance. Be able to see and communicate how your work impacts the business.
- Data governance is a people's game.

How to start your initiative?
- Do an analysis to translate your data pain points into business pain points.
- Assess the size and complexity of the challenge ahead.
- Adjust your lingo to make it understandable for the business.
- Don't be dogmatic - use existing structures. You need to adapt to your environment instead of hoping that your environment adapts to you.
- Make sure you find a strategy to address the conflicting priorities of middle management.
- Watch out for the experienced professionals - they can become bottlenecks.
- Make the value proposition understandable for everyone, also speaking to the priorities of middle management: how can my department become better, more efficient, etc.?
- Include middle management representatives actively in the work around the data itself.

Process excellence
- Processes are seen as a precondition, while they are actually a result of the organization's culture, organization, and people.
- Process owners often have a broad network in the organization and are therefore important for making data governance work.
- «If you do data governance right you get de facto business process ownership.» - you get the right people in place.
- Processes can give you a repeatable structure that is foundational for your success in e.g. AI.
- Have an approach and build a model that your business stakeholders can recognize and see themselves in.
- Don't focus too much on theory. Focus on what matters for your business and for your stakeholders. It needs to resonate.
- Don't try to fix a complex problem with simple solutions.
- «There is a difference between quick fixes and fixing small things.»
«You bring in the knowledge of what works in real life and what doesn't. That is actually what you are being paid for.»

With a year behind him as a solo entrepreneur in his own company, Datakor Consulting, Juha Korpela takes us on a journey through fact-finding missions at what he calls "the middle layer" of organizations - the strategic area between high-level business strategy and tactical project execution. It is here, he believes, that data consultants can create the most significant and lasting value.

We discuss the pitfalls of standardized frameworks and "blueprint" approaches offered by many consulting firms, and why tailored solutions based on a deep understanding of organizational culture always yield better results. Juha shares his methods for knowledge transfer that ensure organizations can continue succeeding with their data work long after the consultant has left the project.

Here are Winfried's key takeaways:

Skills
- The key skill as a data consultant, whether at a strategic or project level, is to understand «what the customer really needs.»
- Listening is central: active listening is the key to understanding.
- Create mental models: when talking to stakeholders, you need to be able to put the information you capture together into a mental model.
- Understanding comes first. Tech comes after.
- Working with data modeling is about listening to stories about how the business works. Understanding business processes is key.
- Understanding stories about the business and what is relevant for data modeling is a skill that everyone can profit from, but that is seldom taught.
- Data modeling is a fact-finding mission. It is about understanding what the organization does, how it does things, and where this could be improved.

Impact
- A data consultant's impact depends on the organization, its structure, and its level of maturity.
- If there is a CDO or CIO to connect to, that can be a good way to create results and visibility.
- As a data consultant, it is also important to find a place in the organization where you have shared views and understanding.
- If you begin bottom-up, you need to be ready to sell this upwards in the organization.

Limits
- Consultants can help with the initial projects to get you started.
- Consultants can help figure out processes and the operating model, and design what is needed.
- Organizations need to create long-term ownership in-house.
- Running and maintaining needs to fit with the organization's culture, structure, needs, maturity, etc.
- Models, blueprints, and frameworks that you get from the outside can get you started, but do not work in the long run.

Patterns
- Data consultants can see certain patterns emerging across an industry.
- That knowledge of patterns, lessons learnt, and experiences is valuable to apply.
- The knowledge you bring in is what defines your value, more than specific skills.
- It is easy for people in organizations to get stuck. Consultants can bring a breath of fresh air.

Knowledge transfer
- As a consultant you bring in new knowledge, and you need to account for the fact that organizations want to transfer that knowledge to internal staff.
- Find ways to create custom training packages to facilitate knowledge sharing.
- You aim for the organization to succeed with their work, also after the consultants are gone.

Consultant-as-a-Service
- Do we move from being consultants to becoming a service offering?
- Service models can create a distance between consultants and clients.
- You need a clear understanding of the impact of models that include ownership and responsibility transfer, e.g. outsourcing operational tasks.
"Dataetik handler også om den måde, vi opfatter brugeren og mennesket, vores demokrati og vores samfund på." / "Data ethics is also about how we perceive the user and the human being, our democracy, and our society."

In this episode, we dive into the complexities of data ethics with Gry Hasselbalch, a leading expert on the topic. With experience shaping EU regulations on data and AI ethics, she shares insights on why human values must remain at the core of digital development.

We explore the principle of "humans at the center" and why people should be seen as more than just data points or system users. Gry discusses how artificial intelligence and big data challenge this idea and why human interests must take priority over commercial or institutional goals.

Here are our hosts' key takeaways:

Humans
- When we talk about data ethics, we need to relate to a value set - in our case a European value set, based on human rights.
- Data ethics is built around humans - a human-centric principle. That means that human interests are always prioritized above organizational interests, commercial interests, or machine interests.
- «User» is not enough when we talk about the human at the center: this will mean different things once the discussion includes AI.
- We need to talk about the whole human, not just the user or the data about the human.
- Systems have an influence on our lives, and therefore the human needs to be seen as a holistic being.

Regulations
- The EU is seen as a «regulatory superpower» that has an ethical starting point when regulating.
- All cultures have different interpretations of and starting points for what ethics means.
- But through history we have been able to agree on an ethical baseline, like the charters of human rights.
- Human dignity is a central part of what ethics means internationally.
- Regulation is not everything - remember that regulation happens due to an identified need.
- Regulations and laws are a guideline, but they do not cover (and cannot cover) the entire topic of data ethics.
- To ensure a value-based approach to data handling, we need to go beyond regulations - and talk about this as a societal challenge.

Socio-technical
- Technology is not neutral - it is developed and applied within a certain cultural setting.
- Technical systems are part of society as much as society is part of the technical systems we develop and use.
- Maybe we should rather talk about «socio-technical infrastructure».
- There is a dichotomy in talking about data as something valuable and at the same time as a liability.
- Data ethics can be viewed as a competitive advantage, a way to build trust and improve an organization's reputation.

AI and ethics
- AI is accelerating the need for ethical data decisions.
- AI is not created out of the blue; it is very much based on our data and our societal norms, developed by humans.
- AI is becoming a solution for «everything» - but what does that mean for the human-machine relationship?
- AI is a tool, not a solution.
- What interests are pushing AI, and what impact does AI have on our social systems and our culture?

Mentioned in this episode:
- Data Ethics of Power - A Human Approach in the Big Data and AI Era
- Data Ethics - The New Competitive Advantage
- Human Power - Seven Traits for the Politics of the AI Machine Age
«Leadership is about sowing the common vision and the common way forward, bringing the people with you.»

How can a nuclear physicist transform into a data leader in the industrial sector? Kristiina Tiilas from Finland shares her fascinating journey from leading digitalization programs at Fortum to shaping data-driven organizations at companies like Outokumpu and Kemira. Kristiina provides unique insights into navigating complex data-related projects within traditional industrial environments. With a passion for skydiving and family activities, she balances a demanding career with an active lifestyle, making her an inspiring guest in this episode.

We focus on the importance of data competence at the executive level and discuss how organizations can strengthen data understanding without a formal CDO role. Kristiina shares her experiences in developing innovative digitalization games that engage employees and promote a data-driven culture. Through concrete examples rather than technical jargon, she demonstrates how complex concepts can be made accessible and understandable. This approach not only provides a competitive advantage but also transforms data into an integral part of the company's decision-making processes.

Here are my key takeaways:
- The AI hype became a wake-up moment for data professionals in Finland taking the international stage.
- As a leader in data, you need to balance data domain knowledge and leadership skills. Both are important.
- Leadership is important to provide an arena for your data people to deliver value.
- As a leader, you are in a position that requires you to find ways of making tacit knowledge explicit. If not, you are not able to use that knowledge to train other people or a model.

CDO
- The Chief Data Officer role is not really present in Nordic organizations.
- An executive role for data is much discussed, but in reality not that widespread.
- Without a CDO present, you need to train somebody in the top leadership group to voice data.
- The CDO role is different in every organization.
- Is the CDO an intermediate role, to emphasize Data Literacy, or a permanent focus?
- You can achieve a lot through the data focus of other CxOs.
- Make data topics tangible; this is about lingo and narratives, but also about ways of communicating - Kristiina used gamification as a method.
- Creating a game that explains concepts in very basic terms, with clear outcomes and structure, can help with Data Literacy for the entire organization.

Data in OT vs. IT
- Predictions and views on production should also be visible in operational settings, on all levels. There should not be any restriction on utilizing analytical data in operational settings.
- Security and timeliness are the big differentiators between OT and IT.
- These are two angles on the same thing. They need to be connected.
- IoT (Internet of Things) requires more interoperability.
- Extracting data has been a one-way process. The influence of Reverse ETL on OT data is interesting to explore further.
- There are possibilities to create data-driven feedback loops in operations.

Data teams
- If you start, start with a team of five:
  - One who knows the data (Data Engineering)
  - One who knows the business
  - One who understands Analytics / AI
  - One who understands the users / UX
  - One to lead the team
- You can improve your capabilities one step at a time - build focus areas that are aligned with business needs and overall strategy.
- If you expect innovation from your data team, you need to decouple them from the operational burden.
- Show your value in $$$.
"Vi modellerer for å forstå, organisere og strukturere dataene." / "We model to understand, organize, and structure the data."

This episode with Geir Myrind, Chief Information Architect, offers a deep dive into the value of data modeling in organizations. We explore how unified models can enhance the value of data analysis across platforms and discuss the technological development trends that have shaped this field. Historical shifts toward more customized systems have also challenged the way we approach data modeling in public agencies such as the Norwegian Tax Administration.

Here are my key takeaways:

Standardization
- Standardization is a starting point to build a foundation, but not something that lets you advance beyond best practice.
- Use standards to agree on ground rules that frame our work and make it interoperable.
- Conceptual modeling is about understanding a domain, its semantics, and its key concepts, using standards to ensure consistency and support interoperability.

Data modeling
- Modeling is an important method to bridge business and data.
- More and more, these conceptual models gain relevance for people outside data and IT to understand how things relate.
- Models make it possible for information to be understood by both humans and machines.
- If you are too application-focused, data will not reach its potential and you will not be able to utilize data models to their full benefit. This application focus, which has been prominent in mainstream IT for many years now, is probably the reason why data modeling has lost some of its popularity.
- Tool advancement and new technology can have an impact on Data Management practices.
- New tools need a certain data readiness, a foundation to create value, e.g. a good metadata foundation.
- Data modeling has often been viewed as a bureaucratic process with little flexibility.
- Agility in data modeling is about modeling being an integrated part of the work - being present, involved, and addressed.
- The information architect and data modeling cannot be a secretary to the development process, but need to be involved as an active part of the cross-functional teams.
- Information needs to be connected across domains, and therefore information modeling should be connected to business architecture and process modeling.
- Modeling tools are too often connected only to the discipline you are modeling within (e.g. different tools for data vs. process modeling).
- There is substantial value in understanding what information and data is used in which processes, and in what way.
- The greatest potential lies in the reusability of data, its semantics, and the knowledge it represents.

The role of the Information Architect
- Information Architects have played a central role for decades. While the role itself is stable, it faces different challenges today.
- Information is in constant flux, and its movement needs to be understood, be it through applications or processes.
- While modeling is a vital part of the work, Information Architects need to keep a focus on the big picture and the overarching architecture.
- Information Architects are needed both in projects and within domains.
- There is a difference between Information and Data Architects. Data Architects focus on the data layer within the information architecture, much closer to decisions made in IT.
- The biggest change in skills and competency needs for Information Architects is that they have to navigate a much more complex and interdisciplinary landscape.

Metadata
- Data catalogs typically include components for Metadata Management.
- We need to define metadata more broadly - it includes much more than data about data; it is rather data about things.
"Den største utfordringen, det viktigste å ta tak i, det er å standardisere på nasjonalt nivå." / "The biggest challenge, the most important thing to address, is standardizing at the national level."

The healthcare industry is undergoing a significant transformation, driven by the need to modernize health registries and create a cohesive approach to data governance. At the heart of this transformation is the ambition to harness the power of data to improve decision-making, streamline processes, and enhance patient outcomes. Jørgen Brenne, as a technical project manager, and Marte Kjelvik's team have been instrumental in navigating the complexities of this change. Their insights shed light on the challenges and opportunities inherent in healthcare data modernization.

Here are my key takeaways:

Healthcare data and registries
- It's important to navigate different requirements from different sources of authority.
- Maintaining comprehensive, secure, and well-managed data registries is a challenging task.
- We need a national standardized language to create a common understanding of health data, what services we offer within healthcare, and how they align.
- Authorities also need to standardize requirements for code and systems.
- The national healthcare data registry needs to be more connected to the healthcare services, to understand data availability and data needs.

Competency
- Data Governance and Data Management are the foundational needs the registry has recognized.
- Dimensional modeling was one of the first classes they trained their data team on, to ensure this foundational competency.
- If the technology you choose supports your methodology, recruiting new resources becomes easier, since you don't need to find experts on that very methodology.

Models
- User stories are a focus point and prioritized.
- Data lineage (how data changed through different systems) is not the same as data provenance (where the data originates). You need both to understand business logic and the intent of collection - user stories can help establish that link.
- Understanding basic concepts and entities accounts for 80% of the work.
- Conceptual models are kept free of technical elements. These models should be shareable, as a way to explain your services externally.
- Cloud first provides an open basis to work from, which can be seen as an opportunity. There are many possibilities to ensure security, availability, and discoverability.
- Digitalization in Norwegian public services has brought forth a set of common components that agencies are encouraged to use across public administration.
- Work based on experiences and exchange with others, while ensuring good documentation of processes.
- Find standardized ways of building logical models, based on Data Contracts.
- By using global business keys, you can ensure that you gain structured insight into the data that is transmitted.
- Low-code tools generate generic code based on the model, to ensure effective distribution and storage of that data in the registry.
- The logical model needs to capture the data needs of the users.
- Data Vault 2.0 serves as a modeling approach to process new data sources while adhering to a logical structure.
- A discipline reference group has been established to ensure business alignment and verification of the models.
- Data should be catalogued as soon as it enters the system, to capture the accompanying logic.

Data Vault
- Adaptable to change and able to coordinate different sources and methods.
- It supports changes of format without the need to change code.
- It makes parallel data processing possible at scale.
- Yet due to the heterogeneity of Data Vault, you need tooling to manage it.
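The Data Vault ideas mentioned above - global business keys, insert-only hubs, and new sources arriving without code changes - can be illustrated with a minimal sketch. This is not the registry's actual model; the entity name, the national-ID business key, and the attribute names are invented for illustration:

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic hash key from one or more business keys.

    Normalizing (strip/uppercase) before hashing means the same real-world
    entity always lands on the same hub row, regardless of source formatting.
    """
    joined = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

# Hub: one row per unique business key (insert-only, never updated)
hub_patient: dict[str, str] = {}
# Satellite: descriptive attributes over time, keyed by hub hash + load date
sat_patient: list[tuple[str, str, dict]] = []

def load_patient(national_id: str, load_date: str, attributes: dict) -> str:
    hk = hash_key(national_id)
    hub_patient.setdefault(hk, national_id)          # hub row created at most once
    sat_patient.append((hk, load_date, attributes))  # new formats just add rows
    return hk

hk = load_patient("01019912345", "2024-05-01", {"municipality": "Oslo"})
```

Because a changed source format only produces new satellite rows against the same hub key, the model absorbs change without code rewrites - which is the adaptability point made above.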
«Data Management is an interesting one: If it fails, what's the feedback loop?»

For the Holiday Special of Season 4, we've invited the author of «Fundamentals of Data Engineering», podcast host of the «Joe Reis Show», «Mixed Model Arts» sensei, and «recovering Data Scientist» Joe Reis. Joe has been a transformative voice in the field of data engineering and beyond. He is also the author of the upcoming book with the working title "Mixed Model Arts", which redefines data modeling for the modern era.

This episode covers the evolution of data science, its early promise, and its current challenges. Joe reflects on how the role of the data scientist has been misunderstood and diluted, emphasizing the importance of data engineering as a foundational discipline. We explore why data modeling - a once-vital skill - has fallen by the wayside and why it must be revived to support today's complex data ecosystems. Joe offers insights into the nuances of real-time systems, the significance of data contracts, and the role of governance in creating accountability and fostering collaboration.

We also highlight two major book releases: Joe's "Mixed Model Arts", a guide to modernizing data modeling practices, and our host Winfried Etzel's book on federated Data Governance, which outlines practical approaches to governing data in fast-evolving decentralized organizations. Together, these works promise to provide actionable solutions to some of the most pressing challenges in data management today.

Join us for a forward-thinking conversation that challenges conventional wisdom and equips you with insights to start rethinking how data is managed, modeled, and governed in your organization.

Some key takeaways:

Make Data Management tangible
- Data Management is not clear enough to be understood, to have feedback loops, or to ensure responsibility for understanding what good looks like.
- Because Data Management is not always clear enough, there is pressure to make it more tangible.
- That pressure is also applied to Data Governance, through new roles like Data Governance Engineers, DataGovOps, etc.
- These roles mash together enforcing policies and designing policies.

Data Contracts
- Shift Left in data needs to be understood more clearly, as moving towards a closer understanding of, and collaboration with, source systems.
- Data Contracts are necessary, but they are no different from interface files in software. It's about understanding behavior and expectations.
- Data Contracts are not only about controlling, but also about making issues visible.

Data Governance
- Think of Data Governance as political parties. Some might be liberal, some more conservative.
- We need to make Data Governance lean, integrated, and collaborative, while at the same time ensuring oversight and accountability.
- People need a reason to care about governance rules and to be held accountable. If not, Data Governance «(...) ends up being that committee of waste.»
- The current way Data Governance is done doesn't work. It needs a new look.
- Enforcing rules that people see no connection to, or ownership of, is doomed to fail.
- We need to view ownership from two perspectives - a legal and a business perspective. They are different.

Data Modeling
- Business processes, domains, and standards are some of the building blocks for data.
- Data Modeling should be an intentional act, not something you do on the side.
- The literature on Data Modeling is old; we are stuck in a table-centric view of the world.
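Joe's point that a data contract is "no different from interface files in software" - and that contracts should make issues visible rather than just block them - can be sketched minimally. The field names, types, and record values below are invented for illustration:

```python
# A minimal data-contract check: the producer declares expected fields and
# types, the consumer validates incoming records against that expectation.
CONTRACT = {
    "order_id": int,
    "customer_id": str,
    "amount": float,
}

def violations(record: dict, contract: dict = CONTRACT) -> list[str]:
    """Return a list of contract violations instead of silently failing.

    Returning the problems (rather than raising on the first one) is what
    makes issues visible to both producer and consumer.
    """
    problems = []
    for field, expected_type in contract.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(
                f"wrong type for {field}: {type(record[field]).__name__}"
            )
    return problems

ok = violations({"order_id": 42, "customer_id": "C-7", "amount": 99.5})
bad = violations({"order_id": "42", "amount": 99.5})
```

In practice the contract would live in a shared, versioned artifact (YAML, JSON Schema, or similar) rather than in consumer code, so that expectations are agreed on at the source - the Shift Left idea above.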
«We want to make data actionable.»

Join us for an engaging conversation with Shuang Wu, Mesta's lead data engineer. We delve into the concept of platforms and explore how they empower autonomous delivery teams, making data-driven decisions a central part of their strategy.

Shuang discusses the intricate process of evolving from a mere data platform to a comprehensive service platform, especially within organizations that aren't IT-centric. Her insights emphasize a lean, agile approach to prioritizing use cases, focusing on quick iterations and prototypes that foster self-service and data democratization. We explore the potential shift towards a decentralized data structure where domain teams leverage data more effectively, driving operational changes and tangible business value in their pursuit of efficiency and impact.

My key learnings:
- It's not just about gaining insights, but also about harmonizing and understanding data in context.
- Find your SMEs and involve them closely - you need insider knowledge about the data, paired with engineering capabilities.
- Over time, the SMEs and the central data team share experiences and knowledge. This creates a productive ground for working together. The more understanding business users gain of data, the more they want to build themselves.
- The central team delivers core data assets in a robust and stable manner. Business teams can build on that.

The data
- You can integrate and combine internal data with external sources (like weather data or road network data) to create valuable insights.
- Utilizing external data can save you effort, since it is often structured and API-ready.
- Don't over-engineer solutions - find out what your user requirements are and provide data that matches the requirements, not more.
- Use an agile approach to prioritize use cases together with your business users.
- Ensure you have a clear picture of potential value, but also of investment and cost.
- Work in short iterations, to provide value quickly and constantly.
- Understand your platform's constraints and limitations, also related to quality.
- Find your WHY! Why am I doing the work, and what does that mean when it comes to prioritization? What is the value, impact, and effort needed?

Service platform
- It is about offering self-service functionality.
- Due to the size of Mesta, it made sense to take ownership of many data products centrally, closely aligned with the platform.
- Build it as a foundation that can give rise to different digitalization initiatives.
- If you want to make data actionable, it needs to be discoverable first.
- The modular approach to the data platform allows you to scale up required functionality when needed, but also to scale to zero when it is not.
- Verify requirements as early as you can.

Working with business use cases
- Visibility and discoverability of data stay a top priority.
- Make data and AI literacy use-case based, with hands-on programs.
- You need to understand constraints when selecting and working with a business use case.
- Start with a time-bound requirements analysis process that also analyses constraints within the data.
- Once data is gathered and available on the platform, business case validity is much easier to verify.
- Gather the most relevant data first, and then see how you can utilize it further once it is structured accordingly.
- Quite often ideas originate in the business, and then the central data team validates whether the data can support the use case.
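The point about combining internal data with structured, API-ready external sources (like weather data) can be sketched as a simple enrichment join. The datasets, field names, and values below are invented for illustration and are not Mesta's actual data:

```python
# Toy illustration: enrich internal winter-maintenance records with external
# weather observations, joined on (date, location).
internal_orders = [
    {"date": "2024-01-10", "location": "E6-Trondheim", "tons_salt": 12},
    {"date": "2024-01-11", "location": "E6-Trondheim", "tons_salt": 3},
]

# External data is often already structured, e.g. one observation per day/site.
external_weather = {
    ("2024-01-10", "E6-Trondheim"): {"temp_c": -8, "snow_cm": 15},
    ("2024-01-11", "E6-Trondheim"): {"temp_c": 2, "snow_cm": 0},
}

def enrich(orders, weather):
    """Left-join internal records with external observations on (date, location)."""
    for order in orders:
        key = (order["date"], order["location"])
        # Unmatched records keep their internal fields (left-join semantics).
        yield {**order, **weather.get(key, {})}

enriched = list(enrich(internal_orders, external_weather))
```

The enriched records now support insights neither source gives alone - e.g. relating salt usage to snowfall - without any effort spent structuring the external data.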
«I think we are just seeing the beginning of what we can achieve in that field.»Step into the world of data science and AI as we welcome Victor Undli, a leading data scientist from Norway, who shares his insights into how this field has evolved from mere hype to a vital driver of innovation in Norwegian organizations. Discover how Victor's work with Ung.no, a Norwegian platform for teenagers, illustrates the profound social impact and value creation potential of data science, especially when it comes to directing young inquiring minds to the right experts using natural language processing. We'll discuss the challenges that organizations face in adopting data science, particularly the tendency to seek out pre-conceived solutions instead of targeting real issues with the right tools. This episode promises to illuminate how AI can enhance rather than replace human roles by balancing automation with human oversight.Join us as we explore the challenges of bridging the gap between academia and industry, with a spotlight on Norway's public sector as a cautious yet progressive player in tech advancement. Victor also shares his thoughts on developing a Norwegian language model that aligns with local values and culture, which could be pivotal as the AI Act comes into play. Learn about the unique role Norway can adopt in the AI landscape by becoming a model for small countries in utilizing large language models ethically and effectively. 
We highlight the components of successful machine learning projects: quality data, a strong use case, and effective execution, and encourage the power of imagination in idea development, calling on people from all backgrounds to engage.Here are my key takeaways:Get started as Data ScientistExpectations from working with cutting-edge tech, and chasing the last percentage of precision.Reality is much more messy.Time management and choosing ideas carefully is important.«I end up with creating a lot of benchmark models with the time given, and then try to improve them in a later iteration.»Data Science studies are very much about deep-diving into models and their performance, almost unconcerned with technical limitations.A lot of tasks when working with Data Science are in fact Data Engineering tasks.Closing the gap between academia and industry is going to be hard.Data Science is a team sport - you want someone to exchange with and work together with.Public vs. PrivateThere is a difference between the public and private sector in Norway.The public sector in Norway is quite advanced in technological development.The public sector acts more carefully.Stakeholder Management and Data QualityIt is important to communicate clearly and consistently with your stakeholders.You have to compromise between stakeholder expectations and your constraints.If you don't curate your data correctly, it will lose some of its potential over time.Data Quality is central, especially when data is used for AI models.Data Curation is also a lot about Data Enrichment - filling in the gaps.AI and the need for a Norwegian LLMAI can be categorized into the brain and the imagination.The brain is to understand, the imagination is to create.We should invest time into creating an open-source Norwegian LLM, as a competitive choice.Language encapsulates culture. You need to embrace language to understand culture.Norway's role is as a strong consumer of AI.
That also means to lead by example.Norway and the Nordic countries can bring a strong ethical focus to the table.
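The Ung.no routing use case - using natural language processing to direct an incoming question to the right expert - is not described in implementation detail in the episode, but the "benchmark model first, improve later" approach Victor mentions can be illustrated with a deliberately simple keyword-overlap router. All expert categories, keywords, and names below are illustrative assumptions, not Ung.no's real taxonomy:

```python
from collections import Counter

# Illustrative keyword profiles per expert group (hypothetical, not Ung.no's taxonomy)
EXPERT_PROFILES = {
    "health": {"sleep", "doctor", "anxiety", "body", "stress"},
    "school": {"exam", "teacher", "grades", "homework", "study"},
    "legal": {"rights", "police", "contract", "age", "consent"},
}

def route_question(text: str) -> str:
    """Route a question to the expert group whose keyword profile
    overlaps most with the question's tokens - a cheap benchmark model
    that a later iteration could replace with a trained classifier."""
    tokens = set(text.lower().split())
    scores = Counter({
        expert: len(tokens & keywords)
        for expert, keywords in EXPERT_PROFILES.items()
    })
    expert, score = scores.most_common(1)[0]
    return expert if score > 0 else "general"
```

For example, `route_question("I am worried about my exam and homework")` returns `"school"`, while a question matching no profile falls through to a general queue - mirroring the point that simple benchmark models buy time for later improvement.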
«Focusing on the end-result you want, that is where the journey starts.»Curious about how Decision Science can revolutionize your business? Join us as our guest Rasmus Thornberg from Tetra Pak guides us through his journey of transforming complex ideas into tangible, innovative products.Aligning AI with business strategies can be a daunting task, especially in conservative industries, but it's crucial for modern organizations. This episode sheds light on how strategic alignment and adaptability can be game-changers. We dissect the common build-versus-buy dilemma, emphasizing that solutions should focus on value and specific organizational needs. Rasmus's insights bring to life the role of effective communication in bridging the divide between data science and executive decision-making, a vital component in driving meaningful change from the top down.Learn how to overcome analysis paralysis and foster a learning culture. By focusing on the genuine value added to users, you can ensure that technological barriers don't stall progress. Rasmus shares how to ensure the products you build align perfectly with user needs, creating a winning formula for business transformation.Here are my key takeaways:Decision ScienceYou need to understand the cost of error of an ML/AI application.The cost of error limits the usability of AI.Decision Science is a broader take on Data Science, combining Data Science with Behavioral Science.Decision Science covers the cognitive choices that lead to decisions.Decision Science can only work in close proximity to the end user and the product, something that has been a challenge for many.From Use Case to productLots of genAI use cases are about personal efficiency, not about improving any specific organizational target.Differentiating between genAI and analytical AI can help to understand what the target is.genAI hype has created interest from many.
You can use it as a vessel to talk about other things related to AI or even to push Data Governance.When selecting use cases, think about adoption and how it will affect the organization at large.When planning with a use case, find where the uncertainties are and what ability you have to affect outcomes.It's easy to jump to the HOW, by solving business use cases, but you really need to identify the WHY and WHAT first.Analysis-paralysis is a real problem when it comes to moving from ideation to action, or from PoC to operations.«Assess your impact all the time.»You need to have a feedback loop and concentrate on the decision making, not the outcome.A good decision is based on the information you had available before you made a decision, not the outcome of the decision.A learning culture is a precondition for better decision making.If you correct your actions just one or two steps at a time, you can still go in the wrong direction. Sometimes you need to go back to the start and see your entire progress.The need for speed can lead to directional constraints in your development of solutions. Be aware of measurements and metrics becoming the target.When you build a product, you need to set a threshold for when to decommission it.Strategic connectionThe more abstract you get, the higher the value you can create, but the risk also gets bigger.The biggest value we can gain as companies is to adapt our business model to new opportunities.The more organizations go into a plug-n-play mode, the less risk, but also the fewer value opportunities.Industrial organizations live under outdated constraints, especially when it comes to the cost of decision making.Don't view strategy as a constraint, but rather as a direction that can provide flexibility.
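The "cost of error" takeaway can be made concrete: whether an AI prediction is worth acting on depends not only on model accuracy but on the asymmetry between the cost of a wrong action and the cost of a missed one. A minimal decision-theory sketch, with invented cost figures for illustration:

```python
def should_act(p_correct: float, cost_false_action: float, cost_missed_action: float) -> bool:
    """Act on a model prediction only if the expected cost of acting
    beats the expected cost of doing nothing.

    Expected cost of acting:   (1 - p_correct) * cost_false_action
    Expected cost of inaction:      p_correct  * cost_missed_action
    """
    return (1 - p_correct) * cost_false_action < p_correct * cost_missed_action

# With symmetric costs, a 60%-accurate model is worth acting on:
#   should_act(0.6, 100, 100)  -> True
# If a wrong action costs 10x a missed one, 60% is no longer enough:
#   should_act(0.6, 1000, 100) -> False
```

The same model can thus be usable in one setting and unusable in another, which is exactly why the cost of error limits the usability of AI.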
«We made a transition from being a company that produces a lot of data, to a company which has control over the data we are producing.»Unlock the secrets of optimizing supply chains with data and AI through the lens of TINE, Norway's largest milk producer. Our guest, Olga Sergeeva, head of the analytics department at Tine, takes us on her journey from a passion for mathematics to spearheading digital transformation in the fast-moving consumer goods industry.Ever wondered how organizations can successfully integrate AI tools into their business processes? This episode dives into the uneven digital maturity across departments and the strategies used to overcome these challenges. We discuss how data visualization tools act as a gateway to AI, making advanced algorithms accessible without needing to grasp the technical nitty-gritty. Olga shares how TINE's data department empowers users by providing crucial expertise while ensuring they understand the probabilistic nature of AI-generated data.Finally, discover how teamwork and a systematic approach can drive data adoption to new heights. 
From improving milk quality with predictive algorithms to optimizing logistics and production planning, we explore practical AI use cases within Tine's supply chain.Here are my takeaways:Mathematics is a combination of beauty, art and structure.Find your way in data and digitalization before jumping on the AI-train.Ensure that people can excel at what they are best at - this is what Tine tries to do for the farmers.Data only has value when it can be used - find ways to use data, from analytics to prediction to more advanced algorithms.Create a baseline through a maturity assessment to see how you can tailor your work to the different business units.Follow up and monitor the usage of your data tools in the different areas of your business.Create a gateway into data for your business users: Once that gateway is established it is also easier to introduce new tools.Data Literacy has a limit - not everyone in the business needs to be a data expert.Yet you need someone you can trust to enable and provide guidance - the Data team.Business users need to understand the difference between concrete answers and probability.How do you transform a complex organization without breaking the culture?Your data/digital/AI transformation team is key in ensuring good transformative action without breaking culture.Ensure you have good ambassadors for your data work in the Business Units, that want to transfer their knowledge in their respective units.Create a network of data-interested people that helps to drive adoption.Engage people by showing an initial value.Offer courses and classes for people to learn and understand more, but also to spread the word about your focus points.In-house courses provided by your own staff can increase the confidence in your data team.AI can mean different things to different people. It is important to define AI in your setting.Don't replace existing work processes with AI-driven solutions, just for the sake of it.
Find ways to focus on where improvement actually provides business value.When you think of a new AI project, you have several options: develop in-house, buy off the shelf, or do nothing. Option two should be your preferred solution.AI strategy is part of a larger ecosystem, with conditions to adhere to.Data and algorithms should become interconnected, also visually represented.«Always remember your core business.»
"Det er vanskelig å komme seg ut av det jeg kaller: et excel-helvete. / It is hard to escape what I call: Excel-hell."Are you wondering how medium-sized companies can handle data strategy and data governance effectively? Join us as we talk to May Lisbeth Øversveen, who has over 23 years of experience in the industry and shares her expertise from Eidsiva Bredbånd. She provides us with insight into how to work with data maturity and the implementation of data strategy.How can mid-sized companies balance resources and create effective data governance strategies? May Lisbeth and I explore this topic in depth. We talk about the importance of involving the business units early in the process in order to create ownership and commitment around the improvement measures.Here are my key takeaways:The way we talk about data as a profession has changed, the lingo has changed and we adapt to trends.To display and evaluate data from different sources that are not connected, Excel becomes the tool of choice.There is a very calculated amount of resources, which limits your ability to set up substantial teams to work exclusively on e.g.
Data Governance.Data Governance in SMEs (small and medium-sized enterprises) can be modeled as a repeatable process that incrementally enhances your data governance maturity.Identify sizable initiatives, ensure that they can be handled with a set amount of resources, and create metrics that enable you to track your progress.You need to find ways to ensure observability and monitoring over time.Don't create something that you have no resources to maintain and improve going forward.To identify the right initiatives at the right time, you need to ensure a close collaboration with your business users.Ensure transparent and traceable ownership of the initiatives from the business side.Creating movement and engagement in data requires continuous and structured communication.Data Maturity AssessmentThere is a need for speed and agility in SMEs, to stay competitive.Data Maturity Assessments are a welcome introduction to ensure that you create a baseline when working with data.There are advantages both to an internal view and to getting some external perspective on your data maturity.Results from a maturity assessment can be a reality check that is not always easy to convey, yet you need to be realistic.Maturity assessments should ideally be both modeled/tailored to the needs of the organization in question, and repeatable and comparable over time and across organizations. Good assessments cover both.To initially increase your maturity you can pick different tasks: low-hanging fruits; «duct-taped» operations that you can finally rectify; known problems that are visible; pain points for your business users.It is good to start with cases that are understandable for business users, create interest, and can easily show value to leadership - this is what creates buy-in.You need to ensure that you keep clear communication towards bigger, more substantial tasks, so your resources are not limited to quick-win actions.Data StrategyData Strategy needs to be closely aligned with business
strategy.Have a clear vision of where you want to go.Having a structured approach to your data strategy - covering people, process, and technology - is important for any work with data.Technology is not the starting point, but rather a consequence of your strategic choices, your organizational setup and your available resources.You need to include well-defined metrics to track progress.Find metrics that are closely connected to business outcomes and value creation.
«The notion of having clean data models will be less and less important going forward.»Unlock the secrets of the evolving data landscape with our special guest, Pedram Birounvand, a veteran in data who has worked with notable companies like Spotify and in private equity. Pedram is CEO and Founder at UnionAll. Together, we dissect the impact of AI and GenAI on data structuring, governance, and architecture, shedding light on the importance of foundational data skills amidst these advancements.Peek into the future of data management as we explore Large Language Models (LLMs), vector databases, and the revolutionary RAG architecture that is set to redefine how we interact with data. Pedram shares his vision for high-quality data management and the evolving role of data modeling in an AI-driven world. We also discuss the importance of consolidating company knowledge and integrating internal data with third-party datasets to foster growth and innovation, ultimately bringing data to life in unprecedented ways.Here are my key takeaways:Whenever a new technology arrives, you need to adapt and figure out how to apply it - often by first using the new tools for the wrong problem.There is substantial investment in AI, yet the use cases for applying AI are still not clear enough in many companies.There is a gap in how we understand problems between technical and business people.
Part of this problem is how we present and visualize the problem.You need to create space for innovation - if your team is bogged down with operational tasks, you are cannibalizing your innovative potential.Incubators in organizations are valuable, if you can keep them close to the problem to solve without limiting their freedom to explore.The goal of incubators is not to live forever, but to become ingrained in the business.CEOs need a combination of internal and external counsel.Find someone in the operational setting to take ownership from the start.The more data you have to handle, the better and clearer your Data Governance strategy should be.Small companies find it easier to set clear standards for data handling, due to direct communication.You want to make sure that you solve one problem really well before moving on.Before intending to change, find out what the culture and the strong incentives in your organization are.LLMs as the solution for Data Management?ChatGPT is already very good at classifying information today.It can create required documentation automatically, when fed the right parameters.It can supersede key-value search in finding information.This can help to scale Data Governance and Data Management work.Data Management will become more automated, but also much more important going forward.RAG architecture - first build up your own knowledge database, with the help of vectorizing the data into a vector database.The results from querying this database are used by the LLM for interpretation.Find a way to consolidate all your input information into a single pipeline to build your knowledge database.Building strong controls on naming conventions will be less important going forward.Vectorized semantic search will be much faster.Entity matching will become very important.Fact tables and dimensional tables will become less important.Data to valueBe able to benchmark your internal performance against the market.Understand trends and how they affect you.How to use and
aggregate third-party data is even harder than with internal data.You need to find ways to combine internal and third-party data to get better insights.
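The RAG architecture Pedram outlines - vectorize documents into a knowledge database, retrieve by semantic similarity, then hand the results to an LLM for interpretation - can be sketched end to end. A real system would use a neural embedding model, a vector database, and an actual LLM call; this toy version substitutes bag-of-words vectors, an in-memory store, and a stubbed prompt (all class and function names here are illustrative):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': word counts (real RAG uses a neural embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class VectorStore:
    """Stand-in for a vector database: holds (vector, document) pairs."""
    def __init__(self):
        self.items = []

    def add(self, doc: str):
        self.items.append((embed(doc), doc))

    def query(self, question: str, k: int = 1):
        """Retrieve the k documents most similar to the question."""
        qv = embed(question)
        ranked = sorted(self.items, key=lambda it: cosine(qv, it[0]), reverse=True)
        return [doc for _, doc in ranked[:k]]

def answer(question: str, store: VectorStore) -> str:
    """Retrieval step feeds an LLM for interpretation; the LLM call is stubbed."""
    context = store.query(question)
    return f"LLM prompt: answer '{question}' using context: {context}"
```

Note how retrieval works on semantic overlap rather than exact names - which is the point behind "strong controls on naming conventions will be less important going forward."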
«Don't go over to the cloud without truly understanding what you are getting into.»Unlock the secrets of cloud migration with industry expert Jonah Andersson, a senior Azure consultant and Microsoft MVP from Sweden. Learn how to seamlessly transition your data systems to the cloud. Jonah shares her knowledge on cloud infrastructure, AI integration, and the balance between Edge AI and Cloud AI, providing a comprehensive guide to building resilient cloud systems.Explore the intersection of IT consulting, Data Governance, and AI in cloud computing, with a specific focus on security and agile workflows. Understand the critical impact of GDPR on data management and the essential collaboration between IT consultants and data governance experts. Jonah and I delve into the growing trend of edge AI, driven by security and latency concerns, and discuss responsible AI usage, emphasizing security and privacy. Learn how to navigate the complexities of multi-cloud strategies and manage technical debt effectively within your organization.Jonah offers tips on avoiding common migration mistakes and highlights the significance of using tools like Azure's Cloud Adoption Framework. Whether you're modernizing outdated systems, merging companies, or transitioning to a new cloud provider, this episode equips you with the essential knowledge and resources to ensure a successful and strategic cloud migration journey. Join us for a deep dive into the future of cloud computing with an industry leader.Here are my key takeaways:Azure services can be tailored to use cases and service needs. 
But you need to understand your requirements and needs.Once you understand what you need to do, you need to gain perspective on the how - what methods and processes are supported?Think security at every step.Security in integrations is an important area we need to focus more on.Bringing different competencies together is a vital ingredient in building resilient applications.Cloud is about where your data resides, how you protect it and how you handle big data.Cloud should support the entire data lifecycle.Cloud and AI«Cloud computing is the backbone of AI.»AI has pushed for Edge AI, in addition to cloud. Reasons for Edge AI are latency, but mainly security.Cloud can provide an attack surface for e.g. data poisoning, lack of control over training data, etc.AI tools can raise concerns about what data you are exposing, and how.Awareness and education are important when building something with AI.You need to at least understand your input to track your output - explainability starts with an understanding of your data sources.There is a risk to Model Governance on-prem, due to the level of competency needed.Multi-Cloud vs.
Single CloudThis is one of the questions to consider at the beginning of a cloud migration.Drivers for a multi-cloud strategy are: avoiding proprietary vendor lock-in; existing applications or infrastructure on another platform; choosing according to the quality of services offered by cloud vendors.If you choose multi-cloud for automated resource management, you need to consider support platforms.Cloud MigrationReasons for cloud migration often boil down to gaining resiliency in the cloud, due to redundancy.You need to uphold Data Quality not just after the migration but also during the transit.Cloud migration requires strategy.There are great resources to help with your cloud migration, like the Cloud Adoption Framework or the Well-Architected Framework.Use observability and orchestration tools for your migration process.Ensure you understand your costs, and can optimize them to fit your needs.
"For me, it really goes back to basic human needs, almost."How can the sense of community support Data Professionals? We dive deep into this question with Tiankai Feng, a prominent figure in data governance and the Data Strategy and Data Governance lead at ThoughtWorks Europe. In this season four premiere of MetaDAMA, Tiankai shares his unique journey and how his passion for music plays a pivotal role in his professional and personal life. His story underscores the multidimensional nature of data professionals and the importance of a supportive community.Building and nurturing internal communities is crucial. Tiankai and Winfried discuss how data governance conferences serve as therapeutic spaces, offering more than just professional development—they provide emotional and communal support. We explore various community models like grassroots movements and rotational leadership, highlighting the indispensable role of leadership in fostering these spaces. Recognizing and valuing community leaders is essential for sustaining these supportive networks within organizations.Lastly, we delve into practical strategies for building strong data management communities. From integrating community introductions into onboarding processes to using these groups as recruitment tools, we cover it all. We also examine how company culture shapes the type of communities that flourish and the support provided by external organizations like DAMA. Joining communities helps alleviate isolation, share solutions, and foster a connected environment. 
Tune in to learn how to make community engagement a cornerstone of data governance and elevate both personal and professional growth.Here are my key takeaways:Communities in organizationsCommunity is needed as a counterpart to the transactional behavior in a workplace.Communities of Practice are an established, methodology-focused model that comes from the technical side.Communities can create new lines of communication that can help spread a sense of belonging in an organization, beyond a specific department or team.Leadership needs to accept that being in a Community is also part of the job.Community leaders need recognition and to be valued for their work.The «smartest person in the room» should not be the leader of a Community - this can turn a community into a lecture setting.Ensure that organizational hierarchies are «flattened» in a Community, to support psychological safety and freedom to speak.Ensure you have some rules of engagement or a code of conduct in place.Breakout groups can be a way to get everyone to participate actively in the Community.Leadership plays an important role in promoting Communities in an organization.Well-functioning Communities of Practice can become a selling point for recruitment.DAMA as a CommunityA Community for Data professionals outside their organizations.The most outstanding impact DAMA can have is networking in a broad community, both locally/nationally, but also internationally across sectors.There is an element of mentoring and coaching that a community of this size can offer.Another factor can be talent-sourcing: both for organizations, but also for job-seekers.Upskilling and learning are a great part of the DAMA Community, also including the CDMP certification.You need to find your balance between domain- or sector-specific communities and large data communities like DAMA.SOME (social media) CommunityYou need to be conscious about what you are reading on SOME.It can be a great place to provoke some new thoughts and get perspective on your
work.There is certainly an entertainment factor to using SOME. Humor can heal a lot, and laughing about challenges we face as Data folks is like therapy.
«We can get lost in politics, when what we should be discussing is policy.»In this season's final episode, we're thrilled to have Ingrid Aukrust Rones, a policy expert with a rich background from the European Commission and Nordheim Digital, shed light on the role of the global geopolitical landscape in shaping digital policies.Explore with us the dominant influence of big tech from the US to China, and how the EU's regulatory approach aims to harmonize its single market while safeguarding privacy and democracy. Ingrid breaks down the contrasting digital policies of these regions and discusses how the EU's legislative actions are often driven by member states' initiatives to ensure market cohesion. We also chart the historical shifts in digital policy and market regulations from the 1980s to the present, highlighting key moments like China's WTO entry and the introduction of GDPR.Lastly, we delve into the future landscape of digital societies and the challenges nation-states face within the context of Web3. Ingrid emphasizes the concentration of power in big tech and its potential threat to democracy, while also lauding the EU's robust regulatory measures like the Digital Markets Act and the Digital Services Act.Here are my key takeaways:GeopoliticsOur security, our economy, and the national and international system rely on data.How data is collected, stored, protected, used, transferred, retained, and so on happens as much across borders as within them.Data Strategy on this geopolitical level is about creating digital autonomy - not being reliant on big international enterprises, so that our political system can stay sovereign.The US is based on a liberal, free-market model that is very innovation-friendly.China is based on a very controlled environment, with limited access to its domestic market: incubation of local companies, shielded from global competition.The EU is setting the regulatory standard.
Freedom is balanced with other values, like fairness or democracy.We need to talk about the role that big tech plays on the global scene.Geopolitical impact on digital policies.Ingrid has a role between policy and business, coordinating and finding opportunities between both.The EU has set the global standard for how we could deal with data and AI from a regulatory perspective.Politics are the decisions we make to set the direction for society.«Policy is the plan and implementation of what is decided through politics.»Cultural differences influence how we perceive, utilize and establish global policies, but also how we work with data in a global market.We have an issue if we only think in 4-5 year election cycles when tackling long-term issues.The EURegulation is the biggest tool the EU has.«We are always in competition with technology, because technology develops so fast, and legislation develops so slowly.»You can see a change in responsibility for the enforcement of EU rules and regulations, where implementation is moved from national responsibility to EU responsibility.The EU system is not an easy system to understand from the outside.The rise of Big TechWe can go back to the anti-trust laws from the 1980s that opened the door for much more monopolistic behavior.The rise of the internet had a large influence on big tech.The liability shield was a prerequisite for social media platforms to gain traction.Big tech has created dependency for other organizations due to e.g. their infrastructure offerings.We need to be aware of that concentration of power in the market.Big Tech is not just leading but also regulating the development of the market.Bigger companies that compete with Big Tech feel their influence and size the most.
«Hva er mulig å gjøre med disse teknologiene når de blir 10 ganger så bra som de er idag? / What might be possible to do with these technologies when they become 10 times as good as they are today?»Can moonshot innovation really be the key to solving challenges that traditional methods fail to address? Today, we're thrilled to welcome Yngvar Ugland from DNB's New Tech Lab, who will unravel the complexities of digital transformation and share his unique insights from both corporate and startup ecosystems. From breaking the mold of the classic "people, process, technology" framework to stressing the importance of customer-centric approaches, Yngvar's perspective offers a refreshing and profound look into fostering genuine innovation within established enterprises.Technological innovation isn't always smooth sailing, and Yngvar helps us understand the friction between traditional mindsets and innovative approaches. Balancing high-trust societies against the urgency-driven dynamics of capitalism, we discuss the complex landscape of AI hype and explore technologies like GPT-3 and GPT-4. With an optimistic outlook, Yngvar encourages us to embrace the transformative potential of generative AI, highlighting the unprecedented opportunities that lie ahead. 
Tune in to gain a deeper understanding of the ever-evolving world of technology and digital transformation.Here are my key takeaways:Yngvar has built and is leading what he calls the «moonshot unit» at DNB.What do we need to do to actually implement and adapt to new technology and ways of working?How do we think tech for people in tech?We can identify three needed dimensions for change: a data/tech component, a business component and a change component.There is a difference between necessary and sufficient - just because a change is necessary, doesn't mean that the proposed solution is sufficient.You need to find ways to navigate uncertainty, and be active beyond concrete hypothesis testing or tech evaluation.For organizations to be successful, you need to coordinate maintenance, improvement and innovation - it's not one of those, but all three in concert that can ensure success over time.Innovation and digital transformation is not a streamlined process.Uncertainty offers a space for opportunity.We use the term agile without grasping its true meaning - an inspect-and-adapt mindset is key to agile.The development from GPT-1 through GPT-2 to GPT-3 is an example of the exponential development of technology.The digital infrastructure in Norway, which can utilize data and technology for value creation across public and private sectors, is a reason for our success.The difference from the US market is that there, large corporations take on societal challenges.How our society is structured has an influence on how we perceive the need for innovation.It is natural to meet resistance in change and innovation.To iterate effectively you really need to live a mindset built around FAIL - First Attempt In Learning.We overestimate the effect of technology in the short term and significantly underestimate it in the long term.
"Det var jo veldig urealistisk å tenke kanskje at en haug med folk som har matematisk eller Computer Science bakgrunn, skal komme inn og skjønne forretningen. / It was very unrealistic to think that maybe a bunch of people with a mathematical or computer science background would come in and understand the business."Join us on Metadama as we welcome Erlend Aune, an accomplished data science expert with a rich background in both academia and industry. Through real-world examples from the Norwegian industry, we illustrate how successful research collaborations and technology transfers can stimulate innovation and create value. Despite the promising advances, we also candidly address the cultural and operational challenges businesses encounter when integrating AI research into their workflows.What practical steps can bridge the gap between theoretical education and real-world application? Our conversation further explores the intersection of business development and the practical application of machine learning and data science. We emphasize the need for environments that foster hands-on experience for students, such as hackathons and industry-linked thesis projects. Additionally, we discuss the importance of tailored training development within organizations, focusing on understanding trainee characteristics to achieve meaningful training outcomes. 
Tune in to gain valuable insights and actionable advice on nurturing the next generation of data scientists and enhancing organizational capabilities.Here are my key takeaways:Data Science and Business DevelopmentData science needs a strong connection to business development.You need to embed Data Science in a cross-functional environment.Business acumen needs to be ingrained in the work with data.Data Science needs to start from the business side - ensure that you work on the problems that generate value for your organization.Data Science works with probability, not certainty - this notion is not yet understood by everyone in business.Data organizations are often built on an engineering mindset, which can be contradictory to an exploratory mindset.Even when designing a Data Warehouse, you need to understand the business impact and have a business development mindset.Norway & AINorway has a great AI and ML research community.The public discourse on AI portrays a quite narrow view that doesn't reflect the broad application and research done in the field.Research & BusinessResponsible AI is not one-size-fits-all. Different organizations have different needs, for either certainty, security, reliability of outcome, etc.
- So a responsible AI approach needs to be tailored to the business need.
- Startups and companies that have products related to the AI research environment have the advantage that products are improved in step with research development.
- In addition to in-house R&D, organizations can collaborate directly with research environments at universities.
- You cannot do R&D just as a pocket of excellence if you want to operationalize results in your organization.
- We need to shorten the distance between R&D and operations.

For the Data Science Student:
- If you apply knowledge on different challenges, you will get an intuition on how to solve a broad variety of challenges.
- When selecting a task within an organization as a Master thesis, make sure the task is delimited.
- Traits to succeed as a student working in industry: interest in your discipline, interest in the organization and its sector, problem-solving, creativity.
"We don't need Data Governance where we don't have anything to fix."

How can Data Diplomacy transform an organization into a data-driven organization? This episode brings Håkan Edvinsson, a visionary in data management and governance, into the conversation, revealing the intricacies and impacts of Data Diplomacy in Nordic organizations. Håkan's journey from business data modeling in the 90s to robust governance practices today offers a treasure trove of insights. Together, we dissect the evolution of enterprise architecture and its role in business innovation.

Discover how data governance is not just about maintaining quality but is a dynamic force that propels organizations forward with each structural change. We discuss the concept of data design and how this approach is shaping the future of responsible data usage in companies like Volvo Penta and Gothenburg Energy. Our dialogue uncovers the importance of integrating governance into decision-making and planning, ensuring data is not just managed but used as a strategic asset for innovation.

The finale of our discussion broadens the horizon, touching upon artificial intelligence and its relationship with traditional data practices. We challenge the status quo, urging businesses to embrace a leaner governance model that aligns with Lean and Agile methodologies.
Alongside this, we unravel the subtle yet crucial distinction between data and information, arguing for proactive business ownership in data design and governance.

Here are my key takeaways:
- If you want an organization to last, someone has to define key terms.
- Data Governance and Data Quality should not be done reactively, but rather by design.

Enterprise Architecture:
- Connecting the work of EA to certain project gates underpins a reactiveness in EA.
- EA claims to be the master interpreter of business needs, yet EA artifacts are based on second-hand knowledge.
- Architecture as well as Governance support a development, they don't dictate it.
- EA is NOT the business designer, just an interpreter, a facilitator that enables those with first-hand knowledge.
- Don't generalize away from business reality.

Data Diplomacy:
- As long as you are working with operational data, you need to embrace business data design.
- You need to bridge Business with IT.
- The «gravity for change», mainly through external factors, provides management attention.
- Use these external triggers to create more with less.
- Don't talk solutions and technology - too many opinions. Stick to the data.
- Focus on what data should look like.
- Base your work on the facts.
- Enabling people to understand data requires Data Governance to take a facilitator role, not an excellence role.
- «Being a hero once doesn't mean you are lasting.» - you need to find a sustainable way of doing data work, beyond task-based checklist compliance.
- Establish a Data Governance network that represents the entire organization.
- A common language and established tacit knowledge can speed up processes.
- You need to be ready, prepared, and on the edge to ensure you are resilient to change.
- Integrate your data decisions into the management structure.
- Firefighting gets more credit than fire prevention.
- Traditional Data Governance is too focused on operational upkeep, lacking a future outlook.
- Data Governance doesn't really have the means to state: what should it look like in tomorrow's world?
- Entity Manager: taking charge of definition, label and structure of a certain data entity - of the data that we should have.
- A Facilitator works with these Entity Managers in their respective areas.
- Håkan advises against a top-down, classical Data Governance implementation.
«AI will be so important in transforming health care as we know it today.»

Join us as we sit down with Elisabeth M.J. Klaussen from DoMore Diagnostics, who are on a mission to transform cancer diagnostics with artificial intelligence to improve patient care and make drug development more effective. With a rich background in quality assurance and R&D within Pharma, Biotech, and MedTech, Elisabeth shares how AI is revolutionizing patient care and the pathway to personalized medicine.

Navigating the complexities of starting a healthcare venture can be as intricate as the regulations that govern it. In this episode, we discuss the maze of regulations across continents, the implications of the European AI Act for innovators, and the non-negotiable necessity of protecting patient data.

Wrapping up our dialogue, we emphasize the importance of a Quality Management System (QMS), especially when developing AI models. As we delve into the EU's AI Act and its potential to harmonize standards, Elisabeth offers invaluable advice to health startups: the development of a robust QMS is not just a regulatory tick box but a foundational pillar for market readiness.

Here are my key takeaways:

AI in Health Care:
- Personalized medicine requires analyzing a lot of data and setting it in a personalized context.
- Creating value with AI in health care is challenging due to the high density of regulations, yet the benefits can be huge.
- AI can enable us to use investments in pharmaceuticals and biotech, as well as patient care, more effectively.
- You need to ensure you can constrain AI models, not only on the data input, but also through use of parameters or model architecture.
- The product from DoMore Diagnostics is, for example,
a static model, not generative, that gives an output on learnings only.
- There is a need to apply for a new CE marking if the model were to change.

Regulations in Health Care:
- You need to understand both your product and its intended purpose to understand which regulations will apply to you.
- You need to set up a team with the right people and competency.
- Try to find generalists - people that have a core competency, but are really good at adopting and learning new surrounding competencies at a more generalist level, so they complement each other.
- Laws and regulations in the industry are getting more and more globally standardized.
- If you adhere to the area with the most stringent rules, you can basically introduce your product to any market you like.
- If you set up your organization for regulatory compliance, you have two perspectives to keep in mind: Internally - how do you set up your principles, policies and processes? And how do you act towards your sector and market?
- The regulation on EU level provides a framework, within which you can find national regulations and laws that go beyond it. One example is product labeling, which can vary between EU countries.

The EU AI Act:
- The EU AI Act introduces requirements that the heavily regulated industry is following already (e.g. quality systems, documented design and development of your product, validations, performance studies).
- EU regulations are political documents that are built on compromise.
- There is a huge constraint within the EU Commission as well as on the authority side to take on the workload that results from the AI Act and other new regulations.
- The more cumbersome regulations are, and the more regulations you build in, the more expensive products will get.
- Standards and regulations can help to structure your ways of working, ensuring efficiency and not wasting time and money doing things over and over again.
- «You can be more creative if you have a structured way of working.»
«Don't make it hard to understand for the business. Make it simple and clear.»

Get new perspectives on Data Governance with Valentina Niklasson from Volvo Penta as she talks about certain patterns and stages in the acceptance of Quality Management or Lean that Data has to go through. Her rich experience in making Data Governance business-centric emerges, showcasing how you can get an organization engaged in Data.

Gain insights on the synergy between lean methodology and effective Data Management. We explore the application of the PDCA Deming circle in Data and discuss how common languages and methodologies bridge the gap between Data, IT and business. This convergence is not just theoretical; it's a practical pathway to tapping into customer insights, translating needs into strategies, and fostering a culture where continuous improvement reigns.

Finally, we delve into the human aspect of Data and Data Stewardship, emphasizing the importance of people over technology in cultivating a data-driven culture. By engaging the curious early and involving them in the development of business information models, we build ambassadors within the business, ready to champion change. Valentina and I talk about the dynamic role of Data Stewards and the approach to involving business personnel, ensuring the smooth adoption of new processes and strategies.

Here are my key takeaways:

Quality management as inspiration:
- Data is still treated as an IT problem, but should really be treated as a business problem.
- We need to find a better way to communicate across data, IT and business.
- Use the same methodology wherever possible and try to reduce complexity in processes.
- Try to adapt to the ways of working in the business, rather than creating separate ways of working for digital, data or IT.
- You need to understand customer relations, end customers and the entire value chain to define needs correctly.
- Standardized ways of working can help you do it right from the start.
- The Deming Cycle (PDCA) can be directly applied to data.
- Think of data as the product you are building, which should have a certain quality standard.
- Don't make it hard to understand for the business: use the same forms and approaches, run a business-data-driven process, and let the business take part in the entire process.
- Lean methodology should take a bigger place in data.
- A product management mindset makes data quality work easier.

Data Stewardship:
- You need to ensure owning the problem as well as the solution.
- High data quality is vital for a data-driven organization. Someone needs to ensure this.
- Stewardship can have a negative connotation. The technical demands on Data Stewards are really big today.
- Data Stewardship works if the Data Steward is part of a broader team.
- The role of Steward needs to be adjusted to the fast-paced reality.
- Data Stewards need to be able to solve problems, not only report to a central organization.
- Data Stewards should be found in the business. You need that domain knowledge, yet they cannot perform the entire stewardship role alone.
- It is most important to empower Data Stewards to start working and analyzing the challenges ahead.
- Don't force Data Stewards to be technical data experts. That should be a supportive role in the digital/data organization.
- If you build something new, engage Data Stewards from the beginning. You cannot take responsibility for something you don't understand.
- If you want to be sustainable in Data, you need to help the people in your organization be part of the journey.
- It's not only about hiring new competency, but engaging with the knowledge you have in your organization.
«Dataen i seg selv gir ikke verdi. Hvordan vi bruker den, er der vi kan hente ut gevinster.» / «Data in itself provides no value. How we use it is where we can extract the gains.»

Embark on an exploration of what a data-driven Police Force can be, with Claes Lyth Walsø from Politiets IT-enhet (the Norwegian Police Force's IT unit).

We explore the profound impact of 'Algo-cracy', where algorithmic governance is no longer a far-off speculation but a tangible reality. Claes, with his wealth of experience transitioning from the private sector to public service, offers unique insights into technology and law enforcement with the advent of artificial intelligence.

In this episode, we look at the necessity of integrating tech-savvy legal staff into IT organizations, ensuring that the wave of digital transformation respects legal and ethical boundaries and fosters legislative evolution. Our discussion continues towards siloed data systems and the journey towards improved data sharing. We spotlight the critical role of self-reliant analysis for police officers, probing the tension between technological advancement and the empowerment of individuals on the front lines of law enforcement.

We steer into the transformation that a data-driven culture brings to product development and operational efficiency. The focus is clear: it's not just about crafting cutting-edge solutions but also about fostering their effective utilization and the actionable wisdom they yield.
Join us as we recognize the Norwegian Police's place in the technological journey, and the importance of open dialogue in comprehending the transformations reshaping public service and law enforcement.

Here are my key takeaways:
- The Norwegian police is working actively to analyse risks and opportunities within new technology and methodology, including how to utilize the potential of AI.
- But any analysis has to happen in the right context, compliant with the boundaries of Norwegian and international law.
- Data Scientists are grouped with Police Officers to ensure domain knowledge is included in the work at every stage.
- Build technological competency, but also ensure the interplay with domain knowledge, police work, and law.
- Juridical and ethical aspects are constantly reviewed, and any new solution has to be validated against these boundaries.
- The Norwegian Police is looking for smart and simple solutions with great effect.
- The Norwegian Police is at an exploratory stage, intending to understand the risk profiles of new technology before utilizing it in service.
- There is a need for the Norwegian Police to stay on top of technological development to ensure law enforcement and the security of the citizens. This cannot be reliant on proprietary technology and services.
- Prioritization and strategic alignment are dependent on top-management involvement.
- Some relevant use cases: picture recognition (not necessarily face recognition) - how can we effectively use picture material from e.g. crime scenes or large seizures; speech-to-text services to e.g. transcribe interrogations and investigations.
- Human errors are way harder to quantify and predict than machine errors.
- This is changing towards more cross-functional involvement.
- IT services are also moving away from project-based work to product-based work.
- They are also building up a «tech-legal staff» to ensure that legal issues can be discussed as early as possible, consisting of jurists that have technology experience and understanding.

Data-driven police is much more than just AI:
- Self-service analysis, even in the line of duty.
- Providing data ready for consumption.
- Business intelligence and data insights.
- Tackling legacy technology, and handling data that is bound to outdated proprietary systems.
«If you want to run an efficient company by using data, you need to understand what your processes look like, you need to understand your data, you need to understand how this is all tied together.»

Join us as we unravel the complexities of data management with Olof Granberg, an expert in the realm of data with rich experience spanning nearly two decades. Throughout our conversation, Olof offers insights that shed light on the relationship between data and the business processes and customer behaviors it mirrors. We discuss how to foster efficient use of data within organizations by looking at the balance between centralized and decentralized data management strategies.

We discuss the "butterfly effect" of data alterations and the necessity for a matrix perspective that fosters communication across departments. The key to mastering data handling lies in understanding its lifecycle and the impact of governance on data quality. Listeners will also gain insight into the importance of documentation, metadata, and the nuanced approach required to define data quality that aligns with business needs.

Wrapping up our session, we tackle the challenges and promising rewards of data automation, discussing the delicate interplay between data quality and process understanding.

Here are my key takeaways:

Centralized vs. Decentralized:
- Decentralization alone might not be able to solve challenges in large organizations.
- Synergies with central departments can have a great effect in the horizontal.
- You have to set certain standards centrally, especially while an organization is maturing.
- Decentralization will almost certainly prioritize business problems over alignment problems that could create greater value in the long run.
- Without central coordination, short-term needs will take the stage.
- Central units are there to enable the business.

The Data Value Chain:
- The butterfly effect in data - small changes can create huge impacts.
- We need to look at value chains from different perspectives - transversal vs. vertical, as much as source systems - platform - executing systems.
- Value chains can become very long.
- We should rather focus on the data platform/analytics layer, and not on the data layer itself.
- Manage what's important! Find your most valuable data sources (the ones that are used widely), and start there.
- Gain an understanding of the intention of sourcing data vs. the use of data downstream.
- «It's very important to paint the big picture.»
- You have to keep two thoughts in mind: how to work a use case while building up that reusable layer?
- Don't try to find tooling that can solve a problem, but rather look for where tooling can help and support your processes.
- Combine people that understand and know the data with the right tooling.
- Data folks need to see the bigger picture to understand business needs better.
- Don't try to build communication streams through strict processes - that's where we get too specialized.
- Data is not a production line. We need to keep an understanding of the entire value chain.
- The proof is in the pudding - the pudding being automation of processes.
- «Worst case, something looks right and won't break. But in the end your customers are going to complain.»
- «If you automate it, you don't have anyone that raises their hand and says: «This looks a bit funny.
Are we sure this is correct?»»
- You have to combine good-enough data quality with an understanding of the process that you're building.
- Build in ways to correct an automated process on the fly.
- You need to know when to sidetrack in an automated process.
- Schema changes are inevitable, but detecting them can be challenging without a human in the loop.
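That last point - detecting schema changes without a human in the loop - can be made concrete with a small sketch. This is a minimal, hypothetical example (the field names and thresholds are invented for illustration, not from the episode) of an automated pipeline that sidetracks drifting records for review instead of silently loading them:

```python
# Minimal schema-drift guard (illustrative sketch, hypothetical field names):
# compare incoming records against an expected schema and sidetrack
# mismatches for human review, rather than letting automation load them
# silently -- the "who raises their hand?" problem from the episode.

EXPECTED_SCHEMA = {"customer_id": int, "order_total": float, "currency": str}

def check_record(record: dict) -> list:
    """Return a list of human-readable schema issues (empty list = OK)."""
    issues = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            issues.append(
                f"type drift in {field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    # New, unexpected fields are also a drift signal worth flagging.
    for field in record.keys() - EXPECTED_SCHEMA.keys():
        issues.append(f"unexpected new field: {field}")
    return issues

def route(records):
    """Split records into (clean, sidetracked) instead of failing the run."""
    clean, sidetracked = [], []
    for rec in records:
        issues = check_record(rec)
        (sidetracked if issues else clean).append((rec, issues))
    return clean, sidetracked
```

The design choice here mirrors the takeaway above: the automated process keeps running on clean records, while drifting ones are sidetracked with an explanation a human can act on.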
«A lawyer has to be compliant. Advice from a lawyer should be fault-free. Therefore it is so difficult to just do something. It is not in their DNA.»

Unlock the secrets to the legal sector's digital transformation with our latest guest, Peter van Dam, Chief Digital Officer at Simonsen Vogt Wiig. We promise you a journey into the innovative realm where data management and artificial intelligence redefine the traditional practices of law. Peter offers us a glimpse into his professional trajectory from legal tech provider to digital pioneer, emphasizing how data and application integration are revolutionizing legal services.

Discover the unique challenges and opportunities that come with a new era of digital sophistication in the law profession. Our conversation dives into the significance of roles like Chief Digital Officer in shaping a progressive future for a historically conservative field. We share stories of how to catalyze excitement for technology among legal eagles and clients alike, and we explore the strategic vision needed to navigate the balance between innovation, confidentiality, and compliance.

The episode examines the expanding potential for automation within legal services. Here, the focus shifts to how digital tools enhance, rather than replace, the human expertise of lawyers. Rounding off the discussion, we shine a light on how law firms are upgrading their data access protocols, ensuring that sensitive information remains under lock and key.

My key takeaways:

LegalTech:
- Legal might seem like a conservative sector, but on the inside everyone, from lawyers to staff to paralegals, is working on continuous improvement and becoming more and more efficient.
- Low code, citizen development, hackathons, etc.
are ways to quickly iterate on ideas and apply them.
- Internal and external marketing of the importance of technology in law is important.
- You have to lift those first-step barriers and get first-hand knowledge of using AI and tech to really embrace it.

Document & Content Management:
- Optimizing interoperability and data exchange between different document management tools is an interesting journey.
- There is huge, untapped potential in unstructured data.
- The biggest challenge for document management is to find ways of cutting through the noise of redundant, obsolete, and trivial data.
- You need a certain quality of data sources to utilize LLMs and genAI.
- Methods of AI Governance need to work in concert with classical methods of data and information management.
- Data volumes are growing exponentially, and so do the costs. Records Management is important to structure data, create retention schedules and ensure that data is available according to need and regulatory requirements.

AI and trends in Technology:
- Find a way to balance need and investment so that you have the relevant tools available when needed, but are not exclusively reliant on those tools.
- Development in technology, data, AI, sustainability, etc. creates more demand for legal services - technological development accelerates legal demand.
- For the practice of law, human interaction is vital. There might be a more differentiated service offering going forward, but human interaction with a lawyer will still be at the core of the practice.

The role of CDO:
- The role of CDO is challenged, because it can mean so many different things in different environments.
- A Chief Digital Officer is important to create enthusiasm about new technology and to actually get it implemented and used.
- Communication is the most important skill and tool.
- As a CDO or digitalization department, you need to think six months ahead, identify trends and find out what can become relevant for your firm.
«We are going to treat our data at the highest level, making sure that we can use it as a competitive advantage. Then it's a strategic choice.»

Unlock the strategic potential that lies at the heart of Data and AI with our latest discussion featuring Anna Carolina Wiklund from IKEA. Embark on a thought-provoking journey with us as we dissect the significance of robust strategies in shaping digital landscapes. From the role of data as the lifeblood of digital commerce to the ways it can radically alter customer behavior, this episode promises insights that redefine the boundaries of e-commerce and digital merchandising.

We explore the complex interplay between business, digital and data, revealing how the alignment of strategies across various organizational levels can forge a path to business impact. Learn how a coherent vision can transform not just marketing strategies, but also those of HR and other departments, and the critical importance of shifting from output- to outcome-focused objectives to measure success.

Finally, we navigate through the evolution of strategy in the face of AI's relentless march, examining the essential need for agility and visionary thinking to keep pace with a rapidly transforming arena. This episode is a masterclass in instilling a culture of excellence, accountability, and collaboration that can propel companies forward. With real-world examples and actionable insights, we offer a clarion call for businesses to reassess and adapt, ensuring that their strategies are not just surviving, but thriving, in the AI era. Join us and fortify your strategic acumen for an increasingly digital future.

My key takeaways:
- «When we talk about product mindset, it's all about how we work as a team.»
- It is important to ensure aligned autonomy when working in a compartmentalized organization with product management.
- You are delivering a piece to the totality.
- «Now, we need to have an adaptive strategy everywhere.»
- Digital is the totality, the ecosystem that you are creating. Data has to flow in that ecosystem.
- There is no digital without data, but there is data without digital.
- People are coming and going within your company, and they bring data along.

One Strategy:
- The goal of strategy is to create one clear direction for the company.
- If you have multiple strategies, you will pull people in different directions.
- Break down strategies to where you deliver the value.
- Organizational models and actual value creation do not always overlap.
- There are transversal strategies that stretch throughout the entire organization (e.g. HR, product), whilst there are specific strategies that strive towards one goal (e.g. marketing).
- You can no longer afford to have business and digital separated.
- Digital tools do not deliver any value unless they are part of a process and used by the business.
- Ensure that you measure what matters: what is the value that you are creating?
- You need to work on a mindset for the totality of the organization, not a digital vs. business mindset.
- OKRs can help to get that forward-leaning mindset and to become more process-oriented.
- The strategic part is really the choices you have, while the plan is the actions you take towards these choices.
- A plan is about creating transparency in the company, so everyone understands what they are delivering and how it fits together.
- You need to have a goal to work towards. Your strategy is laying out the logic to get there.
- «Culture eats strategy for breakfast.»
"We believe that by making data more accessible, the city will become more transparent and accountable to the people that we serve."

In our latest MetaDAMA episode, we're joined by Inga Ros Gunnarsdottir, the Chief Data Officer (CDO) of the City of Reykjavik, who's at the forefront of a transformation towards data-driven innovation grounded in inclusion and accessibility. She walks us through her fascinating journey from engineering at L'Oreal to shaping the future of data use in municipal services. Her insights reveal how simple text, visuals, and a focus on digital accessibility are revamping the way citizens interact with their city's data.

As we navigate the terrain of digital transformation, Inga Ros delineates the distinct roles of a Chief Data Officer versus a Chief Digital Officer, highlighting the intricacies of their contributions to a city's digital ecosystem. Reykjavik's Data Buffet serves as a prime example of how open data visualization platforms can enhance not just transparency and accountability but also literacy in a society hungry for knowledge. She also shares compelling stories of data's impact in classrooms, planting the seeds for a future where every citizen is data-literate.

We wrap up our conversation with a deep dive into the nuances of creating data visualization tools that adhere to digital accessibility standards, ensuring that everyone, regardless of ability, can partake in the wealth of information available. The discussion traverses the significance of maintaining the Icelandic language in data communication and the imperative of ethical data collection practices, especially concerning marginalized groups. By the episode's end, it's clear that the key to unlocking the full potential of data lies in the simplicity and clarity of its presentation, an ethos that Inga Ros champions and we wholeheartedly endorse.
Join us on this journey to discover how Reykjavik is rewriting the narrative on data inclusivity and the profound societal transformations that follow.

My key takeaways:
- Think about how you make data available - design thinking and finding new ways to visualize data are important for inclusion.
- It's the responsibility of the public sector to make as much of their data as possible openly accessible.
- The role of CDO is important, because you need someone to see the bigger picture and how data affects everyone.
- Managing data, especially for public services, comes with a social responsibility.
- The difference between a Chief Data Officer and a Chief Digital Officer: data requires a different skill set than digital transformation.
- Data professionals need to ask the correct questions in a service design process.
- Data access and ownership should be discussed already at the design phase.
- People have expectations towards digitalization in the public sector: you want to access the data you need at the time you need it, from where you are.
- «Data is a valuable societal asset, where we all have the shared responsibility to ensure data quality.»
- Data quality is a precondition for using data to its purpose and its potential.
- You need to think digital universal accessibility when it comes to data and visualization.
- With data stories, the city of Reykjavik uses visual, verbal and sound effects to convey messages through data.
- There is a focus on using accessible language, and on not over-complicating texts.
- Data, especially in the public sector, has not been collected and curated with training AI language models in mind.
- There is a great risk that historical biases and previous lack of awareness are transmitted into our models.

Data Buffet:
- An open data visualization platform and an open data portal.
- Make as much of the city's data easily accessible.
- Access to a wide variety of correct and reliable data is an enabler for innovation in societal services.
«Companies are already wanting to position themselves ahead of the legislation, because they see the value of actually adopting best practices early on and not waiting for enforcement.»

Prepare to dive into the risk-based approach of legislation for artificial intelligence with the insights of Laiz Batista Tellefsen from PwC Norway, who brings her expertise in AI from a legal perspective to our latest episode. We tackle the imminent European Union's AI Act with its sophisticated risk-based approach, dissecting how AI systems are categorized by the risks they pose.

Norwegian companies, listen up: the AI Act is on its way, and it's time to strategize. We discuss the necessary steps your business should consider to stay ahead of the curve, from embracing AI literacy to reinforcing data privacy. Laiz and I dissect the balance between innovation and risk management, and we shed light on how cultivating a culture of forward-thinking can ensure safety doesn't come at the cost of progress. This segment is a must for businesses aiming to turn compliance into a competitive edge.

Zooming out to the broader scope of AI governance, we offer advice for maintaining the delicate dance between compliance and cultivating innovation. Discover the vital guardrails for capitalizing on AI's potential while readying for the unknown risks ahead. We peel back the layers of the AI Act's impact on the legal sector, unearthing the nuances of intellectual property rights and data transfer laws that could reshape your organization's approach to AI.
Join us for a conversation that promises to leave you not only prepared for the AI Act but poised to thrive in an AI-centric future.

Here are my key takeaways:
- Looking at AI from a risk perspective is the right way to tackle the challenges within.
- A risk-based approach makes sure that development is not frozen.
- Our job as experts in the field is to demystify compliance within the use of AI systems.
- Find the right balance between compliance and innovation by assessing potential risks.
- «The AI Act is part of the European Digital Strategy and is the first comprehensive legal framework for AI in the entire world.»
- CE marking forces you to have constant monitoring and compliance of the system, as well as registration in a register.
- Have a holistic approach to AI: how does it fit in the wider setting of my company, from a data, business and cultural perspective?
- There are big differences in companies' maturity in operationalizing AI for value creation.
- The focus on risk and safety does not correlate with the need for speed in AI adoption.
- It's not about starting from scratch, but about understanding the actual use cases and needs.
- The AI Act can foster innovation, because you know what your framework is.
- "Make sure that the data you are using reflects the diversity and the reality of the people and situations that the AI system will encounter."
- Observe and control data quality and distribution continuously.

What to consider now:
- Make sure the company has very good control of known risks, like privacy.
- Make data risk awareness part of your culture.
- Understand roles and responsibilities in your organization towards data risks.
- Have your policies updated.
- Ensure your stakeholders are well trained.
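The advice to "observe and control data quality and distribution continuously" can be sketched in code. Below is a minimal, hypothetical illustration (not from the episode, and a deliberately simple metric): comparing the category distribution of live input against a training-time baseline and raising an alert when the total variation distance exceeds a threshold:

```python
from collections import Counter

def distribution(values):
    """Relative frequency of each category in a sample."""
    counts = Counter(values)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def total_variation(p: dict, q: dict) -> float:
    """Total variation distance between two categorical distributions:
    0.5 * sum over all categories of |p(k) - q(k)|."""
    keys = p.keys() | q.keys()
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def drift_alert(baseline_sample, live_sample, threshold=0.2):
    """True if the live data's category mix has drifted from the
    training baseline by more than the chosen threshold."""
    tv = total_variation(distribution(baseline_sample),
                         distribution(live_sample))
    return tv > threshold
```

In practice such a check would run on a schedule against production inputs; the threshold is a policy decision, which is exactly where the governance discussion above comes in.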
«The journey Software development went through during the last 10 years, working towards DevOps and agile development, is something that we can really benefit from in the data space.»
Uncover the synergy between agile software development and data management as we sit down with Alexandra Diem, head of Cloud Analytics and MLOps at Gjensidige, who bridges the gap between these two dynamic fields. In a narrative that takes you from the structured world of mathematics to the data-driven sphere of insurance, Alexandra shares her insights on Cloud Analytics, Software Development, Machine Learning and much more. She illustrates how software methodologies can revolutionize data work.
This episode peels back the layers of MLOps, drawing parallels with the established tenets of software engineering. As we dissect the critical role of continuous development, automated testing, and orchestration in data product management, we also navigate the historical shifts in software project strategies that inform today's practices. Our conversation ventures into the realm of domain knowledge, product mindset, and federated governance, providing you with a well-rounded understanding of the complexities at play in modern data management.
Finally, we cast a pragmatic eye over the challenges and solutions within data engineering, advocating for a focus on practical effectiveness over the elusive pursuit of perfection. With Alexandra's expert perspective, we delve into the strategy of time-boxed approaches to data product development and the indispensable role of cross-functional teams.
Join us for an episode that promises to enrich your view on the interplay between software and data.
Here are some key takeaways:
- There is a certain push in the insurance industry towards data, AI and automation.
- Gjensidige has over 20 decentralized analyst teams.
- Data Mesh is about empowering analyst teams to take control over their data.
- By taking responsibility for their own data, analyst teams take load off the data engineering teams, so they can focus on the tricky stuff.
- MLOps, DataOps, or classic DevOps in the data space is about applying system development principles to data.
- The questions that arise within data today are questions that software engineering went through 10 years ago.
- Software development also went through a maturing that brought forth a domain-driven focus, a best-practice focus, product thinking, etc.
- Documentation should live where the code lives. It should be part of the code.
- Introduce more software development best practices into the data teams.
- Do not think about the solution you want to develop, but the problem you want to solve.
- Time-box exploratory efforts into sprints.
The pitfalls:
- Software Development Lifecycle vs. Data Lifecycle: they overlap, but there are clear differences, especially in the late phases.
- Feature-driven (or functionality-driven) vs. data-driven: Is there a problem with a software engineering mindset in data?
- Hypothesis - Data Science vs. engineering mindset: exploratory vs. structural thinking can cause friction.
- Environmental challenges: How does the Test-Dev-Prod split fit with data?
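The "documentation should live where the code lives" takeaway can be made concrete with a plain docstring plus an automated test sitting next to the logic. A minimal sketch; the insurance-flavoured function name and formula are hypothetical, not from the episode.

```python
def net_premium(gross_premium: float, commission_rate: float) -> float:
    """Return the premium net of commission.

    The documentation lives in the code itself, so it is versioned,
    reviewed and deployed together with the logic it describes.
    (Hypothetical example; the formula is illustrative.)
    """
    if not 0.0 <= commission_rate < 1.0:
        raise ValueError("commission_rate must be in [0, 1)")
    return gross_premium * (1.0 - commission_rate)

# An automated test living next to the code, in the DevOps spirit:
assert net_premium(1000.0, 0.1) == 900.0
```

Because the docstring and the test travel with the function, a change to the logic that breaks either is caught in the same review and the same CI run.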
«I took the time to actually go through all of my notes, all of the training courses, all of the things that I looked at over the past 30 years of work. And I thought, I want to give myself a reference book. Wherever I go, I have this single thing that will have enough information to remind me of stuff I need to consider. This is now my book of Patterns.»
Get ready to have your perspectives on data management revolutionized! This holiday special serves up a treasure trove of insights as we dive deep into the interconnections of data, information, knowledge, and wisdom. We'll be shining a light on the importance of quality data and the emerging role of data officers in organizations, challenging conventional thinking about systemic behavior changes and their impact on data management, while also stressing the utmost necessity of experimentation and testing to comprehend ever-changing data patterns.
I was lucky to pick the brain of the experienced data expert Jonathan Sunderland, whose career has spanned an array of industries and roles. The conversation is a call to arms for organizations to have clear purposes and goals when striving to become "data-driven." Plus, you'll get an exclusive peek into our guest's impressive "book of patterns" project, which promises to be an invaluable reference for future endeavors.
This is a thought-provoking exploration of the fine balance that large organizations need to strike between agility and long-term goals. We'll confront the dangers of resistance to change and the pitfalls of a myopic focus on quick wins, offering insights on how to foster a culture of innovation without falling into the trap of over-optimization or outsourcing purely for cost reduction. Moreover, we'll dive into the world of data governance, discussing its crucial role in fostering trust in data and facilitating informed decision-making. Finally, we distill the essence of personal growth into three potent rules: challenge, enable, and inspire.
So, what's your capacity? How can you elevate it to tap into your fullest potential? This episode invites you to ponder these questions and propel your personal and professional growth.
Happy Holidays!
«Sentralt i dette med å skape verdi er tverrfaglighet og involvere hele bedriften, ikke bare et lite Data Science-miljø.» / «Central to creating value is multidisciplinarity and involving the entire company, not just a small Data Science environment.»
Prepare for a journey into the landscape of data strategy with seasoned Data Scientist Heidi Dahl from Posten Bring, one of the largest logistics organizations in Norway. She is not just engaged in strategic discussions about data, AI and ML, but also a passionate advocate for Women in Data Science: she took the initiative to create a WiDS chapter in Oslo and co-founded Tekna Big Data.
In our chat about the dynamics of data science and IT, we talk about the balance between research and practical development. Heidi articulates the urgency for a dedicated data science environment, exploring the hurdles that organizations often confront in its creation.
We cross into the world of logistics, shedding light on the potential power of data science to revolutionize this industry. We uncover how strategic use of data can streamline processes and boost efficiency.
Finally, we underscore the importance of nurturing an environment conducive for data professionals to hone their skills, and highlight the role of a data catalog in democratizing data accessibility.
Here are my key takeaways:
Digital transformation of Posten Bring:
- An organization that is 376 years old and has been innovative throughout all of those years.
- The Data Science department was started in 2020 under Digital Innovation, and is now part of Digital Technology and Security.
- The innovative potential is found through use-case-based work closely integrated with the business domains.
- Several algorithms have made their way into production, and that is a goal to measure against.
- The Data Science teams consist of cross-functional skillsets, bringing together data scientists, developers, data engineers and business users.
- The exploratory phase is vital, but has to have a deadline.
- IT-driven development projects do not always match the needs of data scientists.
- Data and IT need to work together, but for exploratory work, Data Science should be able to set up needed infrastructure.
- On cloud infrastructure it can be wise to think multi-cloud to ensure availability of a spectrum of relevant services.
- Posten Bring is looking to build a digital twin of their biggest package terminal for better insight, control and distribution of packages.
Strategic use of data:
- How can we use data to make better decisions, be more effective and smarter?
- The four core elements of the data strategy:
  1. Establish distributed ownership of data and data products.
  2. Increase the amount of self-service.
  3. Build competency tailored to your user groups' needs.
  4. Strive towards the goal of great services and products based on data for your users and customers.
- Role-based self-service capabilities.
- A data catalog is discussed, to gain a better understanding of the data available and its security, but also the context of origin and data lineage.
- A data catalog needs to be able to serve different user needs.
Competency:
There are three perspectives:
- How to recruit new and needed competency?
- How to train and share competency internally?
- How to retain competency?
- Data Engineer is a newer and more specialized role that is hard to find on the market.
- You need to give your data professionals the possibility to do purposeful work, bring things into production and connect to value creation.
- The entire organization should be aware of how to use data to make work more efficient and smart - think data literacy.
«A combination of strong buy-in from top management and a strong flow of change agents (…) is a requirement for succeeding.»
Eager to unlock the secrets behind building a trustful relationship with AI systems? I am sitting down with Ieva Martinkenaite, head of Telenor's Research and Innovation department, to shed light on the interplay of accountability, ethics and AI technology. Through her role as translator between tech, leadership and business, Ieva brings a refreshing vantage point to the dialogue, providing a unique bridge between the tech and business spheres.
We're taking a deep dive into the creation of responsible AI within an organization. Our conversation explores the firm foundation of clear values and top management's proclamations, to cultivate a bottom-up process for a governance structure. Understand the three-layer structure of AI governance and the imperative of expert support for data professionals. Plus, we'll be scrutinizing how adopting responsible AI as a core principle can have a positive social impact.
In the finale of our discussion, we underscore the essence of responsible AI use and the value of investing in data professionals. Discover how individuals and companies can not only fulfill, but surpass, compliance standards. Remember, it's not just about employing AI responsibly but about finding a responsible approach that fits you as an individual and your company.
Here are my key takeaways:
The two scenarios of concern with AI in telecom:
- Missing out!
- Messing up!
- Telecom still needs to catch up, but with a strong focus on using and scaling AI technology.
- The biggest differentiator in the sector is applying methods and technology to provide the best customer service.
- To scale AI, you need to have very solid data capabilities:
  - A cloud-native data platform with various continuously upgraded technologies.
  - Efficient and scalable storage and processing capabilities.
  - Data Governance structures to ensure accessibility and use of data in a secure, privacy-friendly, ethical way.
- You need that foundation before you can start building advanced AI capabilities.
- Apart from data, you need people who are data literate and technically adept.
- A strong data culture is important, not just for the data experts in your organization, but for everyone.
Responsible AI:
- Responsible AI should be built on a solid Data Governance foundation.
- The biggest concern of executives with AI is the lack of traceability with data.
- We need to a) understand what the risks are, and b) create responsible AI by design.
- Executive support and belief in the AI journey is key.
- Data professionals have a responsibility to communicate complexity, translate and apply their knowledge to ensure a more general data literacy.
- You should do everything possible to be able to explain how your models work.
- You need to ensure that it is safe to talk about not understanding systems, too.
Steps to building responsible AI governance:
1. Decide as an organization on your core principles/values - how may they be challenged by AI?
2. Define your principles/values for AI - these should be AI-specific, but adapted to your setting, concerning risk, portfolio, etc.
3. Make these principles/values actionable.
4. Seek endorsement.
5. Build a governance structure.
6. Ensure training and awareness.
Positive social impact:
- Companies should feel a social responsibility to go beyond what is required to build better, more ethical systems and uses of those.
- Ask yourself: why are you doing responsible AI and governance? For compliance obligations, or do you want to go beyond that and build on high ethical standards?
«How well are we rigged in Norway to handle this?»
What a fantastic talk! With so much happening in Norway in autumn 2023, I brought on Alex Moltzau for a chat on AI policy and Norway. Alex Moltzau is a Senior Policy Advisor at the Norwegian Artificial Intelligence Consortium (Nora.ai), and one of the most outspoken experts on AI policy and ethics in Norway.
- Throughout the last years, there has been a significant change in public attention to AI, even though AI has been part of our lives for quite some time.
- There is a great AI community in Norway, with great research being done.
What is NORA.ai?
- NORA is a Norwegian collaboration between 8 universities, 3 university colleges and 5 research institutes within AI, machine learning and robotics.
- NORA is strengthening Norwegian research, education and innovation within these fields.
- NORA's ambition is international recognition of Norwegian AI research, education and innovation.
- NORA's vision is excellence in AI research, education and innovation.
- NORA is active in the Nordics, but also collaborates broadly on the international stage: exchange programs for Ph.D. students, collaboration with other national institutes, contributions to e.g. the OECD, even contributing to shaping bilateral agreements, and more.
Why AI policy?
- There is a growing concern in society about AI and its impact on our lives: how it affects elections, misinformation, our work.
- How can AI help us handle information on our citizens more effectively?
- How does AI affect our children and their learning?
- There is a misconception that we don't have sufficient regulations for AI. Existing laws apply to AI as much as to other methods and technologies.
- What kind of infrastructure do we need to build in society? Is language an important infrastructure for our society?
- What is the public infrastructure, the public good, we need to invest in as a nation?
State of AI in Norway:
- What government mechanisms are we going to build to handle artificial intelligence?
- Three major announcements have shaped the state of AI in Norway during the last weeks and months:
  1. The AI Billion: The Norwegian Prime Minister has announced that the Norwegian Government will invest 1 billion NOK in AI over the course of 5 years.
  2. The Ministry of Defense has published its AI strategy.
  3. A new Ministry of Digitization and Governance has been established in the Norwegian Government, with responsibility for AI.
- Internationally, two concerns around AI are predominant:
  - Security: how to ensure cyber security and reliability in models.
  - Bias: how to tackle bias in AI systems and work with fairness and trust.
- We need to ensure that the possibilities AI brings fit our Norwegian society.
- We need to think about the values we have built our society on, and how AI can support these values.
- Norway is earlier than most countries in actively working with regulating AI, e.g. in relation to privacy.
- AI is about implementation - it is about trying, failing and trying again.
- We need to minimize the possibilities for disaster by learning from other countries.
- There need to be mechanisms to ensure that the cost of compliance with regulations is not too high.
The role of data professionals:
- We would love to see data folks take a more active role in society, helping everyone to understand the challenges within data and AI better.
- Data Management professionals can ensure safety and trust in our society going forward, and should therefore have a more active role in politics.
https://www.nora.ai/
«I think that having a very good framework, where you can put all ML and AI in, makes it much easier, much more clear. (Jeg tror det å ha et veldig bra rammeverk, der du kan putte all ML og AI inn i, det gjør at du får det mye lettere, mye mer oversiktlig.)»
Frende Forsikring, a Norwegian insurance company, has built up a team of six people who work with Machine Learning (ML) and Artificial Intelligence in the company. Their goal is to ensure the company's growth through automation. Anders Dræge is the Head of the Machine Learning and Artificial Intelligence team at Frende Forsikring, and he has always had an interest in data and automation.
Anders is not just an award-winning Data Scientist, one of the Nordic 100 in 2023, but also a person who is happy to share his knowledge.
The goal for automation:
- Automation is a target that can be measured against.
- You can measure both time saved and cost saved.
- High-risk items are a good use case to show the effect of ML: it's not necessarily about replacing work tasks, but about ensuring that human focus is on the items of highest risk and value.
- Automation is a way of scaling and growing your business without increasing resources.
- The need for automation becomes clear in the business, which helps avoid over-allocation of resources.
- Your goals for AI and automation have to be aligned with your organization's business goals.
The composition of the team:
The Machine Learning team is six people strong, consisting of:
- 2 ML engineers
- 2 who are 50% actuaries (domain knowledge connection)
- 1 data engineer who prepares data
- 1 MLOps developer with an interest in ML, to build connections with the IT department
Close collaboration with the RPA (Robotic Process Automation) team and other departments.
The process:
- The trinity of data in ML is paramount for quality results:
  - 1 set to train
  - 1 set to validate
  - 1 set to test
- There are possibilities to automate testing procedures.
- Monitoring can and should be automated.
The technological framework:
- Find a framework that can control your processes, detect deviations and monitor effectively.
- Implementation and setting things in production is much more efficient with a proper framework.
- Finding a standard way of operating will also have a positive effect on onboarding new people.
Key factors for success:
«One factor that was decisive for a very good collaboration across teams and departments is that we are very close. (En faktor som var avgjørende for et veldig godt samarbeid på tvers av team og avdelingene, er det at vi sitter veldig nært.)»
- Physical co-location is a success factor.
- A lot of key competency is in-house.
- A clear and transparent message on automation.
- A culture that is actively striving for automation, finding ways to improve.
- Culture is really important: people have to be receptive to the ideas of automation.
- Find the right time to talk about automation - ideally before the need arises.
Human in the loop:
- Monitoring of process output by humans is important for most of the processes. This is about evaluating output against expectations from human experience.
- Human evaluation becomes input for re-training of the model.
The use cases:
- Automatic email distribution
- Processing of physical mail
- Monitoring of laws
For the work Frende Forsikring has done with Natural Language Processing (NLP) for email distribution, the team won the Dataiku Frontrunner Award 2023.
https://www.frende.no/aktuelt/frende-vant-internasjonal-ai-konkurranse/
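The three-set "trinity of data" described in the process section can be sketched in a few lines. The 70/15/15 proportions below are a common convention assumed for illustration, not figures from the episode.

```python
import random

def three_way_split(rows, train_frac=0.7, validate_frac=0.15, seed=42):
    """Shuffle rows and split them into train/validate/test sets.

    The 70/15/15 proportions are an assumed convention; a fixed seed
    keeps the split reproducible across runs.
    """
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    n_train = int(len(rows) * train_frac)
    n_validate = int(len(rows) * validate_frac)
    train = rows[:n_train]
    validate = rows[n_train:n_train + n_validate]
    test = rows[n_train + n_validate:]
    return train, validate, test
```

Keeping the test set untouched until the very end is what makes the final quality numbers trustworthy, which is also why the monitoring and testing steps around it can be automated.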
«There should be very little reason to say: Hey, I need a human to look at these operational things for me. They are all defined as code.»
Lars Albertsson has a long career in data and software engineering, including Google and Spotify. Lars is on a mission to spread the superpowers of working with data, with the vision to «enable companies outside of the absolute technical elite to work with data with the same efficiency or effectiveness as the technical elite companies, in an industrial manner.»
Four types of companies:
1. Born digital - data is the basis of their business model.
2. Born digital in a traditional market - it is completely natural to use data as a competitive advantage.
3. Traditional industries «born before the internet» - a big difference whether they handle information or operate in the physical world.
4. Information handlers - banks, media, etc. digitalized their whole activity chain a long time ago.
The differences:
- Significant differences in cycle time across industries and businesses.
- The only way to beat this cycle is to try out, fail fast, learn, and try again.
- «Successful companies have been really good at failing fast.»
- Fast-moving cultures are more effective and therefore have a better risk focus, without slowing down.
- To move fast in a slow-moving industry, you need to choose your technology and approach wisely, keeping complexity down.
- Cultural slowness: «The challenge to change the way people work and people think is extraordinarily difficult.»
- Risk and governance are addressed by rituals, rather than tasks.
- The value chain from data to client outcome needs to be anchored in a company. Have a clear picture of what this means.
Getting close:
- Success can be measured by how close you are to the end user. The closer you get to a customer, the better the chances of success.
- «There is no substitute in value creation for talking to the people you actually want to make happy.»
Automation is innovation:
- You need to find ways to ignite people's domain innovation capacity.
- Automation is a gradual process. People don't lose their work to machines overnight.
- Human oversight is still really important, and there is a long journey with humans as part of the process.
- The focus of automation now is on knowledge workers, yet they have a different standing in society and are able to resist better, compared to the workforce during the Industrial Revolution.
- «If it changes quicker than one generation, there won't be natural attrition that matches the changes in the need of the workforce.»
Automated Data Management:
- Automating and industrializing data management processes is lower risk than software development, but still not as common.
- There is great value to be gained from applying simple automation processes to data management.
- You need to build everything from raw data to end product to find ways to automate.
- The raw data is the soul of the end product, and the other way around. You need to keep these two outer points of the pipeline in mind when thinking of data quality and data products.
- The limitations in Hadoop forced people to work in a certain way. That way can be adapted to data management.
- Hadoop really pushed people into the functional big-data patterns that are still the basis of much of the work we are doing today.
- Workflow orchestration can help you know which data you choose for a certain computation.
- Data management as code is an area that is underdeveloped and under-appreciated.
- Minimize the technical barriers from governance, and focus on the social aspects.
Ford CEO on Software: https://www.youtube.com/shorts/HrNN6goQe50
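"Data management as code" can be made concrete with quality rules defined declaratively and checked automatically, so no human has to eyeball the operational checks. A minimal sketch; the field names and thresholds below are hypothetical.

```python
# Quality rules expressed as code: declarative, version-controlled,
# and automatically enforceable in a pipeline. The field names and
# thresholds are hypothetical examples.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v != "",
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def violations(record: dict) -> list:
    """Return the names of fields in a record that break a rule."""
    return [field for field, check in RULES.items()
            if not check(record.get(field))]
```

Because the rules live in version control, they can be reviewed, tested, and run on every pipeline execution, which is the industrial, low-risk automation the episode argues data management is missing.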
«How do you develop good procedures around testing, or how do you drive experimentation in a product or business setting?»
Carl Johan Rising works as Director of Data at Too Good To Go, a marketplace that enables food businesses to sell their surplus food instead of throwing it in the bin.
We talked about how to form a product team, and how to rethink the role of Data Scientists in your team: shape it as an embedded team, close to domains, with expertise and customer focus. We talked about skills, product focus, business partners and much more.
«(A career in) data gives you a bit of everything.»
- Data is a nice intersection between aspects of business, academia, psychological problems, and technical challenges.
- Business - especially understanding and decision making.
- Academia - working with hypotheses.
Data at Too Good To Go:
- If you look into how you want to use data to really drive decisions, it becomes more of a change management challenge, not just a technical challenge.
- «Start with a proper infrastructure foundation - a good clean data model.»
- «Foundation building is invisible, and doesn't by itself bring business value.»
- The business sees the data team as one unit, without distinction between different capabilities in the team - therefore the expectations are different.
- «Make it very explicit what people can do and what their capacity is.» - gain understanding in the business.
Product Analyst:
- «As soon as you have any emphasis on product, its development, its iterations, then it makes sense to have Product Analysts.»
- Too Good To Go works with multiple product teams, each with their own problem space.
- In a product team: Product Manager, Designer, Engineering Lead, Engineers, Machine Learning Engineers, and Product Analysts embedded in the team.
- Product Analysts in each team drive good identification of problem spaces and enable the teams to do rapid experimentation.
- The role asks "how do we drive good?" and sets an experimentation agenda, driven by the Product Analyst.
- Emphasis is on statistical knowledge and technical skills.
- There are two main stakeholders for the Product Analyst: the Engineering Lead and the Product Manager.
- Focus on gathering the best resources to tackle a problem.
What skills and experiences do you need in a PA role?
- Statistical knowledge
- Programming
- Understanding of technical aspects
- Ability to explain results
«I think the role of Data Scientist can mean a lot of different things.»
- Be a bit more explicit about what the work is and what it entails.
- Minimize the possible confusion between expectations on Data Scientists in a company.
Data Analytics Business Partner:
- An embedded role that is part of the business, with co-ownership of the outcomes.
- Through this partnership it is much easier to gather context when you work with the domain.
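The rapid experimentation a Product Analyst drives rests on exactly the statistical knowledge listed above. A minimal sketch of one common building block, a pooled two-proportion z-test for a two-variant conversion experiment; the numbers in the usage note are made up for illustration.

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference between two conversion rates,
    using the pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Phi(x) = 0.5 * (1 + erf(x / sqrt(2))) is the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
```

For example, 10 conversions out of 100 users in variant A against 30 out of 100 in variant B yields a p-value well below 0.01, while identical rates yield a p-value of 1.0.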
«We (DAMA) have a role to play, (…) develop the Data Management profession ultimately for the benefit of the society.»
It is good to be back with Season 3 of MetaDAMA, and as always, we start with a DAMA-focused episode.
Nino Letteriello is one of Europe's most influential data leaders, president of DAMA Italy and coordinator for the DAMA EMEA region (Europe, Middle East, Africa). Nino started his career in project management, educated in civil engineering, and got involved in Data Management around 2017. Since 2019 he has been the regional coordinator for DAMA EMEA.
Here are my key takeaways:
What is so special about DAMA EMEA?
- Lots of passion and commitment by volunteers.
- Giving back to society, so that society becomes more data literate.
- «Whilst we are a geographical region, I see very very different scenarios, very different levels of maturity.»
- Middle East - Saudi Arabia: the government drove a framework, built on DMBoK, for public administration. This is cascading down from public agencies to the big corporations working in Saudi Arabia, and subsequently to SMEs.
- African countries show a scattered approach to Data Management: a great appetite for knowledge in data, and an enormous sense of «missing out».
- Mediterranean countries and Central Europe: initiatives of «data alphabetization», or data literacy, at an early stage; teaching data management at an earlier age, e.g. a program in Italy to teach DM in middle school and high school (DataHigh).
- Nordics: different levels of maturity; starting early with digital competency development. The Nordic view on data for social good and the ethical handling of data is inspirational.
Data literacy and awareness:
- Nino was involved in a WEF (World Economic Forum) study on how SMEs (small and medium-sized enterprises) are leveraging the power of data, collecting information from over 200 small and medium-sized enterprises.
- «Interesting how many companies still consider data an IT thing, a subset of IT.»
- There is still an over-reliance on IT, not seeing data as a business problem.
- Immature on Data Governance and the formalization of roles.
- Awareness is not necessarily followed by clear action.
- SMEs face the same issues as big corporations, but without the means to handle these issues accordingly.
- SMEs have the possibility to be very agile in facing these issues.
- Still a lot of «reinventing the wheel» - we should use DMBoK and other existing frameworks actively as a basis to work from.
Is DAMA still relevant?
- The importance of DAMA and DM is still large, also and especially in times of AI.
- «Garbage in - garbage out» is still as valid as ever.
- New methods, new techniques, new languages - everything is dependent on the quality of the data you put in.
- Everything starts with awareness.
- The real differentiator is that data is a business asset, not an IT asset.
DAMA EMEA Conference:
- Organized for the third year in a row, for the first time both digital and physical, in Bologna, November 29th - December 2nd 2023.
- The conference provides clear, filtered, categorized, relevant information.
- Possibilities to share ideas and network.
- A closed session for all board members in the EMEA region to discuss a declaration of intents for DAMA EMEA.
- Nino likes the idea of doing Data Management Maturity Assessments across countries to compare DM maturity.
- There is also a possible intent to work closer with the European Commission, giving DAMA EMEA a voice towards legislators.
Get more information and sign up here for the conference: https://data-emea.org
"It's about taking a step back to ask yourself: Should we even have a data-driven system for this?" («Det handler om å ta et steg tilbake for å spørre seg: Skal vi i det hele tatt ha et data-drevent system på dette her?»)We finish season 2 with a though-provoking episode, to maybe start som debate about data-driven public administration.Lisa Reutter is PostDoc at the University of Copenhagen connected to a project called: «Datafied Living». We talk about the importance of Social Science in Data, and how data is intertwined with our lives. Lisa is researching in the field of «Critical data and algorithm studies», at the interplay between tech, data and society.Here are my key takeaways:Data in Public AdministrationFor a modern state to function properly and to ensure citizen rights, services, security, etc is provided the state needs data.Data Management by the state for its citizens is not a new concept but has a long historical foundation.During the last years we use more, different and new data in administrative processes, and enhance technological development and a tool box to derive value from dataPublic administration has had a monopoly over management and ownership for citizen data. But this has been challenged by private companies.Data-driven systems in public sector are not there for profit, but to create value for society. 
Therefore they need to be built on, and with the purpose of enhancing, our democratic values.

Registers
Norway and other Scandinavian countries have established national registers to manage and administer society.
There is a reason why registers are not unified in Norway: to ensure a balance of powers.
The opposite example, of what can happen if a state collects information on its citizens without boundaries, is found in the GDR (East Germany).
If all data on all aspects of your life is collected in one place, it is really easy to misuse this data.
Through data a state could see, predict, and control the behavior of citizens.

The public debate about data-driven
Discussions can and should be about what data we are collecting, where we store data, what we are using data for, who could and should have access to that data, etc.
Even with public debate about data use in public administration, limits and boundaries can never be defined clearly, also because this is individual and relative to context.
Datafication is a political act. The citizens need to be involved in the process of technological advancement and intelligent use of data.
The debate around «data-driven public administration» in Norway has not included the public actively.

Customer-centric vs. data-as-an-asset vs. democratizing data
Is there a rhetorical ambiguity between being customer-centric and data-as-an-asset?
Data democratization demands that citizens use their time, resources and energy to ensure that public administration is working correctly.
Is making data available leading to commercial parties capitalizing on that data and building solutions, rather than creating transparency for citizens?
The right education and skills are important, but they need to be available and attainable for all parts of society.
Data Literacy is a subject in its own right, and what it should contain is disputed.
We need to understand that this has implications for how we...
1. Trust the state
2. Trade - what do I give my data for? What do I get in return?
3. Build in accepted ways
4. Weigh opportunities against risk
5. Ensure that the responsibility for understanding does not lie with the citizen alone
6. Gain knowledge, and how everyone can get it
7. Should invite debate
«How can we consolidate data and describe it in a standardized way?»

Scientific Data Management has some unique challenges, but also provides multiple learnings for other sectors. We focused on Data Storage and Operations as a knowledge area in DMBOK - a topic that is often viewed as basic and often not in focus, but is a fundamental part of data operations.

I talked to Nicolai Jørgensen at NMBU - Norwegian University of Life Sciences. Nicolai has a really diverse background. His journey in data started in 1983! In his free time, Nicolai spends time with photography and AI for text-to-image generation.

Here are my key takeaways:

Scientific Data Management
To describe data in a unified way, we need standards, like Dublin Core, or Darwin Core for scientific data.
Data is an embedded part of science and research - you can't have those without data.
You need to make sure you collect the right data, the right amount of data, valid data, and more.
You need to optimize the amount of time, energy and expense spent collecting and validating data.
You need to standardize the way you collect data, to ensure that it can be verified.
There needs to be an audit trail (lineage) between the data you have collected and the result presented in a publication.
Data needs to be freely available for research and testing hypotheses.
Data needs to be findable, accessible and interoperable, but also reusable.
ML algorithms can help extract and find changes to scientific data that is available internationally.
Describing data is key to tapping into knowledge - for that you need metadata.
In times of AI and ML, metadata is still the key to uncovering data.
The development of AI models is a race - maybe we need to pause and get a better picture of cause and effect, and most of all risk.

Standardizing Infrastructure
How can we standardize the infrastructure for research projects?
Minimize or get rid of volatile data storage and infrastructure.
Standardize data storage solutions.
Secure what needs to be secured.
Split out sensitive or classified data and store it separately (e.g. personal data).
Train your end users and educate data stewards.
Have good guidelines for researchers on how to store, use and manipulate data.
There is a direct correlation between disk-space use and sustainability.
«Storage is cheap» is a correct saying if you look at it in isolation - but in the bigger picture the cost is just moved.
Just adding more storage doesn't solve your problems; it might even increase them.

Long-term Preservation & Integrity
To preserve data for the long term you need to:
Encapsulate data at a certain level.
Standardize the way you describe the data.
Upload the data package to a common, governed platform.
Check whether there is a government body that can take responsibility for preserving your data for the time necessary.
Ensure that metadata is machine-readable.
Formats like XML make it possible for both machines and humans to read the data.
Research integrity: conducting research in a way which allows others to have trust and confidence in the methods used and the findings that result.
Ensure lineage and audit trails for your scientific data.
Fake data and data fabrication are serious issues in research - keeping data integrity at the highest possible level is not getting easier, but it is increasingly important.
Changes to data (change logs, change data capture, etc.) can be studied as well; you can build models to build scenarios around data changes.
You can fetch data from other sources to enrich the quality of your data.
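The point about standardized, machine-readable metadata can be illustrated with a small sketch. The snippet below (an illustration, not from the episode; the dataset and its values are invented) builds a Dublin Core description of a hypothetical dataset and serializes it to XML with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Dublin Core element namespace (dc:title, dc:creator, ...)
DC_NS = "http://purl.org/dc/elements/1.1/"

def dublin_core_xml(record: dict) -> str:
    """Serialize a dict of Dublin Core elements to machine-readable XML."""
    ET.register_namespace("dc", DC_NS)
    root = ET.Element("metadata")
    for element, value in record.items():
        child = ET.SubElement(root, f"{{{DC_NS}}}{element}")
        child.text = value
    return ET.tostring(root, encoding="unicode")

# Hypothetical dataset description:
record = {
    "title": "Bird observations, campus wetland",
    "creator": "NMBU field station",
    "date": "2022-06-01",
    "format": "text/csv",
}
xml_doc = dublin_core_xml(record)
```

Because the element names follow a shared standard, any harvester that understands Dublin Core can index the resulting document, and the XML stays readable to humans as well.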
«Sometimes, you look at the problem and think it's valuable, you spend a lot of time on it, and then you find out that this is really a NO!»

Ida Haugland and I talked about the importance of data in shipping, about concrete examples of success through digitalization, and mainly about the focus on Product Management.

Ida works as Principal Product Manager at Omny. At the time of recording Ida was part of Klaveness Digital and presents the perspective from her work there. She started her career with data while working with Customer Satisfaction at Facebook in 2010.

Here are my key takeaways:

Shipping & Logistics
We are talking about really complex lines that need to be managed. Data is the basis for everything.
In shipping and logistics it is vital to «put together a comprehensive plan to see cause and effect.»
Shipping and logistics is on its way to detailed insight into its supply chain, just like logistics for finished goods.
«A lot of principles and thinking behind other products can be applied to shipping.»

Carbon Emissions and the role of Data
Carbon emissions can be minimized based on data-driven insights.
Concrete actions based on data:
Slow down - time shipping and arrival at port with the possibility to embark and berth the shipload.
Use bigger vessels - more cargo, fewer voyages.
Use the full tonnage of the vessel - fill it up to capacity.
Reduce ballast distance - find vessels close to you, to reduce empty voyages.

Digital Product Management
There is a lot of hype around product management in data and digital.
Product thinking means turning your attention to the user/consumer of the product and their needs.
You need to collect data, ensure quality, display it in a way that makes sense, and display it in the right context - that means products.
There has been a development in product management towards becoming more data-informed and customer-focused.
There are still misinterpretations between product thinking and project thinking.
Projects have a set scope and an end-date. Projects are defined by their constraints on scope, time, or quality.
Product development does not have an end-date and cannot be done in isolation.
In product management you have constraints on features, not on the entire product.
In product management you can deliver a lot of value, but you don't always know when.
You can apply project management methods in product management.
When you deliver a product you need to have your consumer, customer and end-user needs in focus throughout the entire development of that product.
Keep your own opinions and preferences out of the product during development.
Don't just ask what products people want, but rather elicit requirements through a controlled process, to find out what they need.
A product manager needs to facilitate and get customer, designer, software engineer, etc. on the same page.

Skills for a Digital Product Manager
User empathy
Storytelling
Eliciting needs
Data analysis
Cross-departmental communication
And the number one rule as a Digital Product Manager: «Don't mistake your own opinion for the right opinion.»
«When technology evolves really fast, the skills you need to hire for also evolve really fast.»

Within a rapidly changing environment, fueled by technology and great ideas, it can be hard to define a stable career path. So I brought in an expert on developing companies and building legacies. Pedram Birounvand has a background in quantum physics and data engineering, experience from Spotify, and moved into private equity 6 years ago. Now Pedram has started a new chapter in his career as the CEO of a startup working with Data Monetization.

Here are my key takeaways:

Data Skills for the Future
As a leader in the data domain, you need to be a storyteller, to tell the story about the necessity of data, like quality or governance.
The job titles Data Scientist, Data Engineer, etc. stayed consistent, but what we expect from someone with those titles has changed greatly over the last years.
Hard skills in data are not as important as they were for every company.
Make sure you know what you are optimizing for in your career:
If you are optimizing for flexibility, being a self-employed consultant is best.
If you are optimizing for building a legacy, be an entrepreneur.
If you are optimizing for leading people and seeing people grow, become a line manager.
Don't become a manager if your passion is engineering. You will need to optimize your time for coaching people, not for problem solving as an engineer.
«The technology of applying AI and ML becomes more and more simple and becomes more and more commoditized.»
Don't hire Data Scientists to build models that you can buy out of the box.
Don't hire Data Scientists if you need Data Analysts. They work entirely differently and the work is not comparable.
If you hire a Data Scientist before having good Data Engineers, the Data Scientist cannot create any value.
«In order to be successful as an engineer, you need to have a really transformative mindset.»
You need to enjoy the learning process; if not, focus on something else in the IT domain.
Adopt an agile mindset.
Agile fundamentals are key to today's work life.
«You need to embrace being able to incubate things.» Build incubator squads as soon as a good idea pops up.

Recruitment
In a small company you need to be much more flexible and broader in the way you tackle problems than in a larger company, where you can be more specialized.
As a hiring manager, don't lean too much on titles, but make sure you understand what you need in your company. This is key to writing a good ad and attracting the right talent.
In a job advertisement be specific: What does it mean to be a Data Engineer in the context of your business?
What is important for you as a company today, based on the trends coming?
Building code has become so much simpler. Do you still need developers that know all the details about a certain language?
Maybe a person that can be close to the business, and not so deep in programming, can add more value?
«You have to know what it is you are optimizing for.» If you have an extremely complicated technology stack you need deep knowledge; if not, don't hire for it.
In a recruitment process, focus on the soft skills of rapid learners that can adjust to new situations and have an interest in understanding your business use cases.
«My interview secret: Share a whiteboard session with me.»
Try to figure out how self-sufficient a candidate can be in understanding how the business works and where to get relevant data.
Test how candidates react in uncomfortable situations, with customers that are not always happy about results and solutions.
Look for candidates that show resilience in new and uncomfortable situations.
Career models should be technology agnostic.
«Infants with guns!»

Are we mature enough to track, collect and handle data responsibly, according to ethical standards? I talked with Steen Rasmussen, Director of Data Innovation at IIH Nordic, about the business impact of Data Ethics.

Here are my key takeaways:

If we track, collect, and keep all data for any random opportunistic purpose, we put our companies at risk.
This includes a «commercial curse» of budget-heavy tracking and budget-light management and business value creation through data.
ROT, Norwegian for clutter, is an acronym for Redundant, Obsolete, Trivial - the data that clutters your way to finding valuable data.
Collection and tracking of data is still too dependent on people: if there is a change in personnel, you get situations where new people «clutter the clutter».

Marketing & Sales
For many companies it was Marketing & Sales that drove the data-driven agenda.
The big value of Marketing & Sales is to add the market dimension to the data.
You can actually relate your product to the market, and ship to where the market is.
Analyzing market data is «putting a fixed entity on a moving target». The market changes too rapidly to provide good analysis.
The further you push behavioral forecasting into the future, the bigger your uncertainty.

Business value & Ethics
Corporate irresponsibility is an issue.
Sometimes we get involved in a project for the project's sake.
Data projects have for a long time been theoretical, so the impact was not visible.
ChatGPT is a black box. Should we really give it more firepower if we don't know how it works?
The market determines that there is business value in being first.
The speed of innovation doesn't give reactive regulatory bodies time to regulate efficiently.
Companies need data-ethical guidelines that say how they will, shall and can use data.
Who should define data-ethical guidelines in a company?
It is still done at a user level, whilst senior management is looking at market situations and weighing them against ethical guidelines.
We need regulatory and top-level guidelines that cannot be bent according to market situations.
«Ideally, but highly unlikely, we need a global set of data-ethical guidelines.»
The more trustworthy you are as a company, the more relevant data is shared with you.
With that trust and data, you can understand the market better than companies that are not trustworthy and are basically flying blind.
Personal data literacy is important, and we need basic digital skills in our society.
There is also a lack of understanding when it comes to measures put in place for people's benefit, e.g. cookie banners.
We are still lacking good privacy-approved alternatives to the tools we are using on an everyday basis.
Everyone has to follow ethical guidelines. We cannot have a DarkOps department in our company.
Data ethics guidelines should be something everyone can refer to.
Ask yourself: What is the minimum of data we require to collect? Anything else should become an ethical question.

Data Protection Laws
There are differences in the interpretation of regulations within the EU.
Nordic countries interpret law relatively: is this just, fair, reasonable?
Southern European countries use a more Napoleonic or dogmatic approach, where «the law is the law, and the law must be obeyed.»
Both ChatGPT and Google Analytics have been handled differently by data protection authorities.
Data protection authorities generalize too much, and don't look at differences in technology.
Is a strict, generalized interpretation creating panic for users of e.g. Google Analytics?
«There is a fundamental conflict between the essence of fashion and Machine Learning.»

Fashion is always about change, innovation and identity, whilst ML is good at making predictions based on historical patterns - not change. How do those go together?

I had a fantastic chat with Celine Xu, who is Head Data Scientist at H&M Group, with a mixed background in Applied Mathematics and Business (MBA).

Here are my key takeaways:

Three things data can achieve:
Automation of systems where we have a clear understanding of the process.
Sourcing, filtering and ranking of data input for decision making.
Ad-hoc analysis of all kinds of A/B testing or correlation relationships.

Focus for ML in Fashion
«ML can be embedded across the supply chain from product design to customer.»
H&M focuses ML on two areas:
Future-proofing the customer experience: search experience, personalization, fashion inspiration, etc.
Demand-driven supply chain: reduce cost and be more sustainable through demand forecasting, cruise control, logistics optimization.

ML in Product Design (some examples)
3D modeling can make customers more comfortable, in a visual way, in choosing the right size.
Digitalizing fashion sampling is an interesting use case.
Early detection of formats, colors, etc. that can generate trends.
Optimizing logistics and storage through demand forecasting.
Shop assistance for employees, with sales data, inventory and warehouse information.
Language to image: use of stable diffusion for design patterns, at the material level.
H&M is experimenting with trending words on TikTok or Instagram and transferring them to production design.
The focus is not on understanding a sentence or the sentiment behind it, but on using hashtags or product descriptions to find out what the trends are, and linking those back to a certain garment feature.
Purchasing records (when and where did you buy what) and web viewing records (devices used, pages that led you to your purchase, etc.) are vital pieces to collect for behavioral analysis.

The Challenges of ML on behavioral and cultural data
ML models are often evaluated on ML metrics, like accuracy, recency, etc., but not really on business metrics, like revenue, profitability, etc.
Your fashion taste is influenced by your music taste, your interior design, and so much more.
GDPR and website viewing regulations limit data source availability.
It is even harder to get the data accurate at a granular level - e.g. do you buy for yourself or are you purchasing a gift?
With a lot of different dimensions in the data, you need to balance accuracy with cost.
ML is always learning from something that already happened. If a new situation occurs, there is a lack of information and data.
The complexity of behavioral data increases bias.
You could artificially enrich your data set, e.g. by drawing conclusions from certain data like income, area of residence, etc.
Emotional and temperament data (what value you attach to your fashion) can be important to provide to your model, but also as an indication of how you would react to recommendations.
A small decision in your model or source data can lead to a butterfly effect.
In fashion the definition of style is vague, but in ML you want to quantify certain features.
Fashion trends last around 3 months on social media, but production time is around 6 months. So how can you react in time?
Images of material can be deceiving. Camera angle, light, etc. can influence how the color is represented.
Representation of textiles is hard, because only a limited number of different textiles can be identified by image alone.
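The gap between ML metrics and business metrics can be made concrete with a toy example (the numbers and models are entirely invented): two recommenders with identical accuracy can produce very different revenue, depending on which recommendations they get wrong.

```python
# Each tuple: (was_recommendation_clicked, margin_of_item_if_bought).
# Model A is accurate on cheap basics; Model B hits the high-margin garment.
model_a = [(1, 5), (1, 5), (0, 80), (1, 5)]
model_b = [(1, 5), (0, 5), (1, 80), (1, 5)]

def accuracy(results):
    """Classic ML metric: fraction of recommendations that hit."""
    return sum(hit for hit, _ in results) / len(results)

def revenue(results):
    """Business metric: margin captured by the hits."""
    return sum(margin for hit, margin in results if hit)

acc_a, acc_b = accuracy(model_a), accuracy(model_b)  # identical ML score
rev_a, rev_b = revenue(model_a), revenue(model_b)    # very different outcome
```

Both models score 75% accuracy, but B earns several times the margin of A, which is exactly why evaluating only on ML metrics can be misleading.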
«MLOps is a set of practices that bring people, process and platform together into a stream-aligned process to manage end-to-end Machine Learning lifecycles.»

MLOps is talked about a lot, so I asked an expert what we are actually talking about. Xiaopeng Li is AI business lead at Microsoft for the Western European market, located in Oslo. Xiaopeng is a passionate influencer in the field of Data and AI/ML, who was nominated as AI Influencer of the Year at last year's DAIR awards in Stockholm.

Here are my key takeaways:

Patterns in AI adoption
AI adoption projects are quite diverse, but some patterns are visible across them. Here are use cases that a lot of industries are working with:
Business process automation as an AI use case - adopting AI to process documents automatically and extract key values.
Natural language understanding and processing, but also natural language generation.
ChatGPT.
Knowledge mining.
Unstructured data analysis.
«Nordic countries are at the forefront when it comes to adopting AI and ML.»
Some of the most advanced search capabilities used in Microsoft are developed in Norway.
Nordic countries are typically quite tech-savvy.
Nordic countries have very good infrastructure.

What is MLOps?
MLOps is about agility, productivity, consistency and quality.
It is about creating scalability for your Data Science work.
MLOps is a vague concept and you can probably find a variety of different definitions. Is MLOps the intersection between DevOps, ML and Software Engineering?
Scale ML development and deployment with consistency, with quality, with speed.
The three elements that are most important are people, process and platform.

People:
5 particularly important roles: Stakeholder, Cloud Infrastructure Architect, Data Engineer, Data Scientist, Machine Learning Engineer.
There are many different roles involved in MLOps, from cleaning data to testing a model and implementing it.
These roles need to be orchestrated.
Domain experts and stakeholders play a critical role in defining the challenge in the first place. They can formulate what to achieve and what is good enough.
Change management is important, especially if your ML implementation triggers behavioral change.

Platform:
You need a secure, scalable infrastructure to build your models on.
Mature organizations who do ML at scale mostly have an integrated architecture for Data Management, Analytics and Machine Learning.

Process:
Data collection, data processing and data management are processes you need to focus on in MLOps.
You need a process and the right competencies to gather use cases in the first place.
Build a backlog of initiatives and then prioritize based on e.g. data availability, feasibility of the solution given the current tech landscape, value for the business, cost, time to market, etc.

Path to MLOps
Always start with assessing your current landscape and maturity.
Start by assessing your platform capabilities.
Ensure you have the right competencies and people.
If you want to operationalize MLOps, don't look at it as a technological problem, but as something that includes the entire organization.
Key is to bring key stakeholders into the discussion as early as possible.

Oslo AI:
https://www.linkedin.com/company/oslo-ai/
https://www.meetup.com/oslo-ai/
Link to MS learning: MLOps Maturity Model
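The backlog prioritization step described above can be sketched as a simple weighted scoring exercise. The criteria, weights and use cases below are illustrative assumptions, not Microsoft's method:

```python
# Illustrative weights per criterion; each use case is scored 1 (poor) to 5 (good).
WEIGHTS = {
    "data_availability": 0.3,
    "feasibility": 0.2,
    "business_value": 0.4,
    "time_to_market": 0.1,
}

# Hypothetical backlog of AI initiatives:
backlog = [
    {"name": "document key-value extraction",
     "data_availability": 4, "feasibility": 5, "business_value": 3, "time_to_market": 4},
    {"name": "demand forecasting",
     "data_availability": 2, "feasibility": 3, "business_value": 5, "time_to_market": 2},
]

def score(use_case: dict) -> float:
    """Weighted sum of the criterion scores."""
    return sum(WEIGHTS[k] * use_case[k] for k in WEIGHTS)

prioritized = sorted(backlog, key=score, reverse=True)
```

However crude, a scoring sheet like this forces the team to make the trade-offs (value vs. data availability vs. feasibility) explicit before committing to an initiative.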
«The real value in changing the status quo and getting more women to the forefront is by having women share the important work that they do, and not just talk about their gender as a topic.»

This episode was recorded right before Christmas, when I had the pleasure to chat with Alexandra Gunderson and Sheri Shamlou.

Alexandra, inspired by a Women in Data dinner in New York, took it upon herself to find like-minded people in Norway. That is how she came across Women in Data Science, the conference that was brought to Norway by Heidi Dahl in 2017. The first meetup as a community was in June 2018, and this year's WiDS event «Crossing the AI Chasm» is coming to Oslo (and digitally) on May 24th, 2023.

Here are my key takeaways:

Women in Data Science (WiDS)
«Creating a meeting place, a place for people to connect and get inspired.»
Creating a platform and stage for outstanding women.
Here are some of the events WiDS organizes:
«Champagne Coding» - a hands-on event.
«Data after Dark» - an after-work event: 1-2 quick, high-level presentations.
«Data for Good» - get together and solve difficult challenges for greater causes.
An important mission is to increase the number of role models in the community to look up to.
The goal is to provide arenas to learn together, so it is as important to share stories about failure and collaborate around the learnings from those.
WiDS is looking for sponsors, and one benefit can be that through events, real-life use cases can be solved.
The focus for 2023 is «scalability»: how to get unstuck from ML and AI pilots and bring your work to production?

The Quest for Diversity
Diversity is a complex topic with several perspectives: gender, nationality, background, knowledge, expertise, and experience.
Why is diversity important?
It leads to more innovative and effective solutions.
It leads to more fair and just outcomes.
The starting point when working with diversity on a daily basis is awareness.

Diversity in the workplace
Diversity doesn't magically happen.
You have to work for it.
Awareness is a first step, but you also need to collaborate in broader groups.
The value is gained when you are able to include everyone in your events and talks.
For people to work together against biases of any kind, you need an inclusive culture from the beginning.
Be open in your communication and foster a culture of collaboration.
The «3rd shift» is an important requisite for women to be able to spend the same amount of time and intellectual capacity at work.
The work for an inclusive work environment is never over. We have to continuously work on it and talk about it.

Diversity in recruitment
You have to actively seek out and hire people with different backgrounds.
In recruitment, be aware of how you write a job announcement.
Use gender-neutral language (avoid things like «Data Science Ninja» or «Data Rock Star»).
There are online applications that check whether your language is gender-neutral and suggest replacements for biased words.
You need to highlight more of the possibilities that come with a job, like growth and learning opportunities.
Minimize the list of requirements in a job posting.
Be aware of your own biases and work with diverse teams in recruiting as well.
When screening CVs, be aware that different people write in different styles.
In an interview process, e.g., women often don't like to do coding tests with someone watching them.
Get involved: LinkedIn group, Meetup or https://www.widsoslo.com/.
«Data Mesh promises so much, so of course everyone is talking about it.»

I had the pleasure to chat with Karin Håkansson about Data Governance in a mesh. Karin has worked with Data Governance and Data Mesh and is really active in the Data Mesh community, speaking on podcasts and moderating the LinkedIn group «Data Governance - Data Mesh».

Here are my key takeaways from our conversation:

Data in Retail
The culture in retail is about innovation, experimentation and new products, so governance has to adapt to this environment in order to be successful.
If retail did what we do in data, a fashion retailer would sell yarn instead of t-shirts.
Retail knows what the customer wants before the customer wants it. What would happen if we in data thought like retailers?
It's more about understanding the business better than about making the business data literate.

Data Governance
Data Governance best practices in the DMBOK are still relevant, also in a Data Mesh setting.
Data Governance has been on a journey from compliance-driven to business-value-driven.
Centralized Data Governance creates a bottleneck. Decentralized governance creates silos.
So federated Data Governance is the middle ground.
Create incentives to create trust.
If you utilize your platform correctly, you can have high expectations of computational governance.

Data Mesh
Data Mesh comes with a cost - you need to invest in Data Mesh.
But more than anything, Data Mesh implementation is an enormous change effort.
If you don't know why you want Data Mesh, you will implement something else.
Implement Data Mesh in an agile way: «start small, fail fast and iterate».
To start with Data Mesh, work with a business team that is eager to get started and sees the benefits - «You have to have the business onboard, otherwise it's not going to work.»
Always check if you get the value that you expected.
When you do it, make sure you get governance, business and tech teams to work together and be aligned on the why.
Make sure to upskill for Data Mesh - it is fundamentally different: talk about it, have debates, book clubs, and more.
The 4 elements of Data Mesh: Can you implement them in a sequence, or should you look at them as a unit to implement within a limited scope?
Start finding ways for people to work together (e.g. a common goal, and an environment where it is ok to share).
A good first step is to find an example of data with a certain issue or limitation and talk with the business user about exactly this.
Data Governance, as much as Data Mesh, is about change management: you need to get close to the business and collaborate actively.
Your first two steps should be:
Start with one business unit, an early adopter.
Find their most critical data and talk about actual data.

MDM and Data Mesh
Are we still hunting for that golden record? How do we work with MDM in a mesh?
This is not solved yet.
You can refer to data, instead of collecting data in an MDM system.
Maybe the best approach so far is global IDs to track data across domains - but how you link your data might become the new MDM.
«You still need to connect the data, but you don't need to collect the data.» - MDM in a mesh.

Domains, federation and responsibility
If you federate responsibility to the domains, they also need the resources and competency to fulfill these responsibilities.
If the domain data teams are successful in abstracting the complexity, it will become easy to create data products.
If you scale too fast (faster than your data platform), you might end up having to duplicate teams.
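The «connect, don't collect» idea can be sketched with a minimal global-ID index. The domain names, records and IDs below are hypothetical:

```python
# Each domain keeps its own records; a thin index maps a shared
# global ID to where the data lives, instead of copying it into a hub.
sales = {"S-17": {"customer": "Acme AS", "global_id": "C-001"}}
logistics = {"L-99": {"receiver": "ACME", "global_id": "C-001"}}

global_index: dict[str, list[tuple[str, str]]] = {}
for domain_name, records in [("sales", sales), ("logistics", logistics)]:
    for local_id, record in records.items():
        # Store only a reference (domain, local id), never the record itself.
        global_index.setdefault(record["global_id"], []).append((domain_name, local_id))

# All places that know customer C-001, without moving any data:
locations = global_index["C-001"]
```

The index connects the two differently spelled customer records through the shared ID; each domain remains the owner and single source of its own data.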
«If we think of Data Mesh as an evolution of data lakes, knowledge graphs are an evolution of Master Data Management.»

Data overload is becoming a real challenge for all types of businesses. All the data that is gathered, in multiple formats and huge volumes, has created a need for connected, contextualized data. Combined with continuing developments in AI, this has resulted in increasing interest in knowledge graphs as a means to generate context-based insights.

I had a fantastic chat with Mozhgan Tavakolifard. Mozhgan describes herself as «incubator and alchemist». She did PhD research on trust and social media, and the techniques Mozhgan used to collect data for her research introduced her to Data Science. In this episode of #MetaDAMA, we talked about knowledge-graph-enabled Data Mesh.

Here are my key takeaways:

Data-driven transformation
Personal transformation and transformation of businesses are quite similar.
Very few companies can claim that they are industrializing data and are transformed in a data-driven way.
For a transformation to be successful, you need to have a holistic view and invest in a practical manner according to your business case.
Don't change your business culture to match a data culture, but rather let data be an enabler for the business.

Data Mesh
Each domain can have a different culture and produce different data products.
4 ingredients:
Give data back to the domains where it is produced.
Create self-service data infrastructure.
Federated governance.
Data as a product.
The centralized data and analytics platform has failed.
A knowledge graph can basically be considered the data supply chain for Data Mesh.
A KG can be used to semantically link data products.

Knowledge Graphs
KGs enable a human-brain-like approach to deriving new knowledge.
A KG is quite simply any graph of data that accumulates and conveys knowledge of the real world.
Every consumer-facing digital brand, such as Google, Amazon, Facebook, Spotify, etc., has invested significantly in building knowledge graphs, and the concept of graphs has evolved to underpin everything from critical infrastructure to supply chains and policing.
There is a difference between a knowledge graph and a graph data store. The semantic layer is what makes data smart.
A KG is when you have a dynamic and rich context around knowledge.
A KG can be used to semantically search for data.
KGs are a very important part of Data Mesh.
If you want to start with knowledge graphs: find your business case. What is the purpose for you?
«RIP Semantic Web! The Semantic Web is dead.»
Maybe part of the problem was that it was academia-focused, more than practical and industry-focused.
If we manage to implement Data Mesh on a societal level, we might have taken a big step towards realizing some parts of the Semantic Web.

Trust
Trust and explainability are based on context. Knowledge graphs can provide context and connections, and this can ultimately generate trust.
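How a knowledge graph semantically links data products can be sketched with a tiny set of triples. The product names, predicates and domains are made up for illustration:

```python
# A knowledge graph, at its simplest, is a set of (subject, predicate, object)
# triples. Here the triples link hypothetical data products to their domains.
triples = {
    ("orders:data-product", "producedBy", "sales-domain"),
    ("orders:data-product", "references", "customer:data-product"),
    ("customer:data-product", "producedBy", "crm-domain"),
}

def objects(subject: str, predicate: str) -> set:
    """Follow edges from a subject via a given predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Traverse the context: which domain produces the product that orders reference?
linked = objects("orders:data-product", "references")
producers = {d for product in linked for d in objects(product, "producedBy")}
```

The traversal answers a question neither record holds on its own, which is the point of the semantic layer: knowledge lives in the links, not in any single data product.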
«How can you use AI algorithms to make your city more sustainable?»

How can we use AI to work more sustainably and optimize our operations for less pollution and more efficiency? I talked with Umair, Head of Data Science, Data Warehouse and Artificial Intelligence at Ruter AS, the public transport authority in the Oslo region, the Norwegian capital. Umair is also an associate professor at OsloMet, teaching AI to bachelor-level students, as well as founder and CTO of Bineric Crowdsourcing and founder of the volunteer organization Offentlig AI.

Public Transportation:
«Public transportation is complex, because data is now coming in from many different internal and external sources.»
«A bus can have a minimum of 20 sensors.» All these sensors are sending realtime data.
A huge amount of data is collected through external sources.
Ruter is moving away from centralized Data Management teams to more of a mesh approach.
Data Mesh will give you a more complex data function, with a need for more people and more organization and coordination.
One team cannot have the full ownership of the entire data-driven prerogative of a company.
2 factors that helped Ruter succeed with data and AI:
End-to-end responsibility for the whole AI algorithm.
Create in production, and don't overdo PoCs.

Sustainability & AI
Capacity prediction
Predict capacity 3 days in advance, but possibly up to 1 month in advance.
This gives passengers the possibility to plan their trips better.
An operations team that sees a peak in traffic in realtime sends additional buses to ensure enough capacity.
Through the prediction algorithm, fleet capacity can be reduced, and it is easy to plan for balancing the load beforehand.
Fleet management
The long-term vision for Ruter is to work more with order services.
In the future you shouldn't have to walk to a bus stop, but can order a bus to your home.
The existing solution is for seniors (67+) and is tested in the Viken area.
But how can you ensure that the buses are close to an eventual future order?
Ruter is training an algorithm to predict where orders might come from, to ensure a bus is parked close by. This results in less driving and fewer emissions.
To train the algorithm, Ruter mainly uses historical information, combined with e.g. weather information.
Analysis of customer feedback
Sentiment analysis to see how happy or unsatisfied a customer is.
Explainable AI
«AI is just statistics on steroids!»
It is hard to explain how a probability output is achieved.
That makes an AI algorithm a black box.
Developers create a set of tools and applications which can give insight into the factors behind a decision made by an AI algorithm.
Quantum Computing
Traditional computing is expensive and needs more time and resources to reach a specific output.
Volumes of data are constantly increasing.
This was a research project together with OsloMet.
Quantum computing was cheaper to work with than traditional computing.
Quantum computers are more sustainable and energy-efficient.
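As a toy illustration of the sentiment-analysis idea mentioned above, here is a hedged sketch of simple lexicon-based scoring of customer feedback. Ruter's actual model is not described in the episode; the word lists and scoring rule below are purely illustrative assumptions.

```python
# Hypothetical sketch: lexicon-based sentiment scoring to see how happy or
# unsatisfied a customer is. Positive score -> satisfied, negative -> unsatisfied.
# Word lists are invented for illustration, not Ruter's real vocabulary.

POSITIVE = {"good", "great", "happy", "punctual", "clean"}
NEGATIVE = {"late", "crowded", "dirty", "cancelled", "unhappy"}

def sentiment(feedback: str) -> int:
    """Count positive words minus negative words in the feedback text."""
    words = feedback.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("the bus was late and crowded"))  # negative score
print(sentiment("great punctual service"))        # positive score
```

Production systems would typically use a trained language model rather than word lists, but the output is the same kind of signal: a per-feedback score that operations teams can aggregate and act on.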
«It's not just about the use of data, but the use of data in a cross-functional setting.»
What a fantastic conversation with Nina Walberg. Nina has been with Oda since 2019 and has a background in Optimization and SCM from NTNU.
Oda strives to create a society where people have more space for life, and to make life as hassle-free as possible. To achieve this with the help of data, Oda has created its 6 principles for how they create value with data.
Here are my key takeaways:
Business model and use of data
Oda's business model allows for a better and more conscious way of thinking about sustainability. The quantity of products can be tailored to the actual need.
Climate receipt: Oda provides data on the climate footprint to its customers when they order products.
Use data to provide not just inspiration to your customers, but also help them create a complete basket of groceries for the week, to avoid any additional trips to the supermarket.
The use of data needs to be combined with all functions, also business functions. All should be part of the development process.
6 Principles
The 6 principles were recently updated, but originally created by the entire team in 2019.
Oda believes in autonomous teams, and that trust and responsibility are given to these teams.
Domain knowledge and discipline expertise
70% of Data and Insight people are embedded in cross-functional teams.
Data is connected across the company and across domains, so you cannot work exclusively in an embedded model. You need some central functionality.
Data maturity differs between domains, so the embedded model really depends on the circumstances and how Oda applies it.
Data Mesh has been an inspiration.
Distributed data ownership, shared data governance
Processes and parts of the product are developed by Oda's domain teams, so federating the ownership is a natural move.
Domain boundaries need to be explicit. «Every model that we have in dbt is tagged with a team.»
Ownership of certain products is harder, or sometimes not right, to distribute.
It can either be core products with no natural domain team to own them, and that many are dependent on, or the core data platform itself.
Customer data first, but without proper product data you can't live up to that.
Data as a product
Make sure you don't just deliver a product, but that it meets the customer need, not just the customer demand.
Consistency matters, structure matters, naming matters when it comes to data products.
Enablement over handovers
Enabling others to do what they should be able to do themselves.
Oda has established a segmentation model with five levels of self-service, with different expectations for different user groups.
Self-service needs to be tailored to the different roles, maturity, and needs of the internal users.
Data University, Data Hours and many other initiatives help to create a learning culture and improve data literacy.
Impact through exploration and experimentation
It is important to test and see how a solution actually provides value against expectations.
This provides insights and information you can act on.
Proactive attitude towards privacy and data ethics
Data ethics needs to be incorporated and can't be an afterthought.
Company values can and should be directly linked to the work with ethics.
This episode was recorded in September 2022. Click here if you want to know more.
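The quote «Every model that we have in dbt is tagged with a team» suggests a simple, checkable ownership rule. Here is a hedged sketch of such a check over a parsed schema.yml-style structure; this is not Oda's actual tooling, and the model names and tag convention (`team:` prefix) are invented for illustration.

```python
# Illustrative check for "every dbt model is tagged with an owning team".
# The data structure mimics parsed dbt model configs; names and the
# "team:" tag convention are hypothetical assumptions.

models = [
    {"name": "orders", "config": {"tags": ["team:logistics"]}},
    {"name": "customers", "config": {"tags": ["team:crm", "pii"]}},
    {"name": "untagged_model", "config": {"tags": ["pii"]}},
]

def models_missing_team_tag(models):
    """Return names of models that lack an explicit team: ownership tag."""
    return [m["name"] for m in models
            if not any(t.startswith("team:")
                       for t in m["config"].get("tags", []))]

print(models_missing_team_tag(models))  # models with no owning team
```

Running a check like this in CI is one way to keep domain boundaries explicit as ownership is federated out to domain teams.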
«If you do guidance correctly, people will follow it. People want to do the correct thing. Nobody wants to do things wrong.»
From fisherman in Iceland to Data MVP in Copenhagen! Asgeir has been through a fantastic journey, and we had a great conversation about Power BI governance. Asgeir has his own blog about the topic.
Here are my key takeaways:
Think of platforms as a metaphor: you are on a journey, and the platform is what you stand on and push off from to start that journey.
A lot of organizations are only on maturity level 1 when it comes to Power BI, even though they think they rank higher, or even if there are different levels of maturity in different parts of the organization.
Buzzwords help shape opinions and start discussions. With good help, these opinions can be shaped into actions.
If you have the possibility to do a greenfield approach to Data Mesh, consider it: it will not get easier than this.
The parts of Data Governance that can be solved with a focus on technology and process are much more mature and easier to handle than the softer parts that are concerned with people.
It is easier to lock out intentional mistakes than unintentional ones.
Governance is changing focus, from being compliance-driven to providing a framework for how we can get more value from our data; it's about making people productive.
By making people more productive and efficient, you can pay off the cost of governance really fast.
If you use data more proactively, you are moving power from people to the machine. But there needs to be a balance; it's not either-or.
Power BI
In Power BI implementations, governance should have been part of the conversation from the start.
Power BI enables people in your organization to use data on their own. There is always value in that.
To use self-service tools correctly, you need to have people formally trained and/or have guidelines in place BEFORE they start doing things.
Not everyone in your organization will become a Power BI expert or is a data person.
Don't expect everyone to go there.
Governance is about giving people a framework and a good chance to do things correctly from the beginning.
Most low-hanging fruit: build a report inventory!
Try to keep the usage of Power BI within the M365 environment and meet people where they are.
Know your requirements before you start using tools.
The 5 pillars cover all of what your governance strategy implementation should cover:
The people pillar is about having the right roles in place and training people to perform in these roles. Because Power BI is a self-service tool, the administrator role is often allocated randomly.
The processes and framework pillar is about having proper documents in place so users can use Power BI correctly and be compliant. This covers administrative documents as well as end-user documents.
The training and support pillar is about making sure everyone that uses Power BI has gotten the required training. This is also about deciding what kind of support mechanisms you need to have in place.
The monitoring pillar is about setting up monitoring of Power BI. Usually, it involves extracting data from the Power BI activity log as well as the Power BI REST APIs for information about existing artifacts in your Power BI tenant.
The settings and external tools pillar is about making sure Power BI settings are correctly set, as well as how to use other approved tools to support Power BI (extensions to Power BI).
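The "low-hanging fruit" of building a report inventory, and the monitoring pillar's use of the activity log, can be combined: aggregate activity-log events into an inventory of reports and who uses them. The sketch below uses a simplified, hypothetical event shape; in practice these events would come from the Power BI admin activity events REST API, which returns a richer schema.

```python
# Hedged sketch: build a report inventory from Power BI activity-log-style
# events. The event dictionaries here are simplified stand-ins for real
# activity events; field names mimic, but are not guaranteed to match,
# the actual API payload.

from collections import defaultdict

events = [
    {"Activity": "ViewReport", "ReportName": "Sales", "UserId": "anna@example.com"},
    {"Activity": "ViewReport", "ReportName": "Sales", "UserId": "bo@example.com"},
    {"Activity": "ViewReport", "ReportName": "Churn", "UserId": "anna@example.com"},
    {"Activity": "ExportReport", "ReportName": "Sales", "UserId": "bo@example.com"},
]

def report_inventory(events):
    """Map each report name to the set of users who viewed it."""
    inventory = defaultdict(set)
    for e in events:
        if e["Activity"] == "ViewReport":
            inventory[e["ReportName"]].add(e["UserId"])
    return dict(inventory)

inv = report_inventory(events)
print({report: len(users) for report, users in inv.items()})
```

Even this minimal view answers the first governance questions: which reports exist, which are actually used, and by how many people, which helps decide where training, support, and cleanup effort should go.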