#226 Learnings From Implementing Data Mesh at a Large Healthcare Company – Interview w/ Mike Alvarez

Sign up for Data Mesh Understanding’s free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.

Mike’s LinkedIn: https://www.linkedin.com/in/2mikealvarez/

In this episode, Scott interviewed Mike Alvarez, former VP of Digital Services, who led the data mesh implementation at a large healthcare distribution company. He's now working on his own startup.

Some key takeaways/thoughts from Mike’s point of view:

  1. Lean in to the new value-creating possibilities that can come from empowering thousands of your colleagues to leverage data.
  2. As an industry, we have to learn to do data work incrementally. That hasn't been the norm, and it can challenge people's perception of data work, but it's crucial to get where we want to go.
  3. You can drive data mesh buy-in from domains by showing them the freedom they will have. Autonomy, empowerment, going at their own speed, etc. can get many to lean in.
  4. Advice to past data mesh self: Early in your journey, you can share your vision until the cows come home and people will say they understand – and probably think they understand – but it’s incredibly easy to get misaligned. Really focus on what you are trying to achieve. What are the target outcomes?
  5. Similarly, it will be harder than you expect to drive buy-in. Many people say that but it’s still going to probably be harder than you expect after hearing that 🙂
  6. We need to move away from old approaches to data at large companies because the sheer scale of those initiatives creates bloat and becomes a risk factor in itself. Small and nimble gives us much quicker time to value and builds toward much greater outcomes.
  7. Shadow IT develops to try to move at the speed of business for domains. But it’s rarely scalable or robust enough to even support the domain in the long-run and it certainly isn’t built to integrate well with the rest of the organization. Try not to hold past shadow IT decisions against domains.
  8. Most teams – especially pre data mesh – don’t truly understand the data they are ingesting. It’s on consumers to get more information, but if the producers aren’t helping them, teams will ingest what they can even if they don’t fully understand it. Data they don’t fully understand still drives value, but it could be driving so much more.
  9. Start from the problem first: what am I trying to solve? Do I need a new approach or can I use something I already have? Don’t reinvent the wheel but we might just have to reinvent doing data at scale a la data mesh.
  10. Collect stories of past attempts internally with negative outcomes. What were the common reasons, the common patterns for things failing or not delivering expected value? They are useful for perspective and to drive buy-in.
  11. Treating data as a product makes more and more sense the deeper you dig into it. But data as a product alone can’t survive as a complete approach to doing data.
  12. When trying to share information about data mesh, it’s not like everyone will instantaneously understand or be on board. It will likely take a while in most organizations to build up the momentum to even consider starting on a data mesh journey. Have patience.
  13. Data mesh really enables teams closest to the customer, closest to the day-to-day business, to drive more value through data. It allows them to react much more quickly as the world evolves and focus on the problems of the customer.
  14. ?Controversial?: The operating model change with data mesh is what drives the real value. And lots of domains can get bought in on owning their own destiny: being empowered to manage their data like a product instead of in non-scalable, quickly deteriorating ways. Scott note: I think we do need better tech to fully leverage the potential value of data mesh but right now, I agree that most of the value is driven by operating model changes.
  15. A shared vision of what you are trying to achieve is important. It lets people rally around something and start to build a community, which is crucial to delivering on a data mesh approach.
  16. ?Controversial?: Don’t try to force your domains, your lines of business, to leverage your centralized tooling and comply with optional governance (there is non-optional governance, of course). For those who do leverage central tooling, pay them back by automating away toil where possible. Community is about give-get. Be a good member of the community.
  17. The three crucial dimensions of a product: viability, feasibility, and desirability. When adopting product thinking, consider whether your product satisfies all three.
  18. You need to communicate when something isn’t feasible. Too often in data, people have just said no instead of ‘no, and here is why…’ Let people in on your thinking and prioritization process around what work to do when.
  19. Good product management skills are necessary to understand data as a product – to transition us from creating and sharing data sets to sharing high value information exchanged via a data product. You need to delve into and understand the domain to figure out what would be most useful to share via a data product.
  20. !Controversial!: It might be time to completely let go of the concept of “single source of truth.” We’ve been chasing it in data for so long but the cost/benefit is starting to look like it doesn’t make sense. What are we trying to achieve – perfect data or a strong understanding of the world and how it’s changing? Scott note: strongly agree and so does Zhamak.
  21. New, more correct information about aspects of the business is not always welcome. Unfortunately, you might have pushback if you attempt to tackle a problem that changes people’s view of their business. So choose use cases, especially early, well 🙂

Mike started off with the general need for large companies to change their approach to analytics at scale. We’ve been doing a lot of the same things for the last 30 years and they aren’t quick enough to respond to changing business needs – 6+ months and $1M+ to get to your first query just doesn’t make sense anymore – did it ever? And we can do better now. The business side of companies shouldn’t have to wait for data and see the world change well before a solution is delivered. We need to move at the speed of business.

Regarding shadow IT, despite leading a central data/IT organization, Mike doesn’t hold it against domains. The lines of business can’t deal with the bottlenecks of going through a central team, so they try to build things themselves. However, what they build is rarely all that scalable and certainly isn’t designed with sharing to the rest of the organization in mind. Shadow IT just isn’t built with a product mindset, so it becomes brittle and dilapidated quickly. So the central team is a bottleneck but the decentralized approaches don’t scale. Add in teams often not truly understanding the data they are ingesting – or even producing – and it’s a recipe for data underperforming expectations. Of course, this is what Zhamak identified and why she created data mesh.

Mike talked about when considering a new approach to data, he didn’t want to do data mesh for the sake of it. What was the problem they wanted to solve? Could an existing approach or platform do what was necessary? What were the organization’s past failure modes or times when things didn’t meet expectations and what were the common through-lines or patterns? And then he took those past unmet expectations and used them for understanding as well as driving buy-in. The definition of insanity is trying the same thing over and over and expecting different results. So if data projects were constantly not meeting expectations, shouldn’t we change the way we approach data? And treating data as a product seemed like a great start, which led to selecting data mesh 🙂

While data mesh can feel like the right call to some immediately, it’s not likely to be the universal reaction at any organization. Mike and team spent a number of months working out how this could work and building up the buy-in and momentum to even start their data mesh journey. This isn’t an overnight approach; you really need to think deeply about how it could work and – back to those potential failure modes – how it could go wrong, so you can avoid heading down bad paths as best as possible.

But what really drove Mike’s interest in data mesh as a possible solution was how it could enable the teams closest to the customer to react to customer and market needs, especially changes in customer demands/wants/challenges. It is about empowering the teams to move at the necessary pace to stay ahead of the competition instead of waiting for a centralized team to give them access to leverage their own data or the data of teams close to them in the organization.

For Mike, the value of data mesh isn’t about the technology shifts, at least not yet. It’s about the operating model shift – giving domains the capabilities and empowerment to handle data. We are trusting them to own their data and giving them the ability to do so in a scalable way. We are giving them the ability to react in a much quicker and more meaningful way. All of these can get people leaning in to doing data mesh. But domains don’t care whether it’s data mesh or any other paradigm. That’s where data people need to connect the dots for them: how can this work, what benefit does it have for the domain, and what actually changes for them?

To get the most out of data mesh, Mike believes you have to have a strong vision of what you are actually trying to achieve. It’s not an approach to take on lightly. You need to really think about aligning everyone around that shared vision and build as a community effort. How do you take the principles and new approaches and focus on delivering business value – for each domain and the broader organization too?

Mike believes a big part of doing data mesh is a kind of social contract around enablement and empowerment. Sure, teams can go off in their own direction, but if they give up some of their autonomy and stick to the centrally provided tooling – which makes governance far easier – you need to give them something in return. In their case, Mike and team gave the gift of automating away a lot of the toil work 😀

On advice to his past data mesh self, Mike talked about how, early in a mesh journey, people believe they are aligned on vision but they probably aren’t. Holding all of data mesh as a concept and then contextualizing it to your specific organization is a massive amount of work and cognitive load. If you try to get someone to fully understand that upfront, before seeing any progress, you will almost certainly have some misalignment and misunderstanding. Instead of the specifics, focus on the target outcomes: what are you trying to achieve? If people align on the benefits, you are more likely to gain and retain momentum. And it will take a lot of effort to get most people committed to the vision, so just be prepared for that.

Mike talked about the three key aspects of a product: viability, feasibility, and desirability. Feasibility is a crucial aspect to consider in data – especially in data as a product thinking – because often, something just isn’t likely to work for a number of reasons. And when there is desirability but not feasibility, you really need to communicate why it’s not going to happen. With data mesh, there can be a misconception that a switch has been flipped and we can do any data work we can think of – and that we should! But that prioritization process, and understanding – and then communicating – the current art of the possible, is important. Always be communicating about what you are doing, when, and why.

Domain understanding is crucial to really understanding data as a product for that domain in Mike’s view. How do we move from trying to serve data sets as if that is the product to creating the information that will be most useful to consumers about the domain in a productized way? And then iterating towards more and more value as you improve the data product or suite of data products representing the domain. Easier said than done of course.

Mike asked the provocative question of do we still want to seek the fabled “single source of truth.” It can be a bit like the dog chasing its tail – when you catch the tail, then what? Are we trying to perfectly clean data or are we trying to drive value from data? Is the juice worth the squeeze or can we drive better value – and especially nimbleness – by taking a slightly different view? Scott note: Zhamak urges people to consider “the most relevant source of truth” because there are multiple perspectives on the same things that can all have value, you have to decide what is best.

Mike warned that some use cases are politically untenable or even toxic. Especially early in your journey, consider whether participants will actually want to know the information. Yes, in the abstract, we want everyone to be perfectly data driven, but humans aren’t and won’t ever be. Don’t ignore that and tackle something that will be more hassle than it’s worth.

In wrapping up, Mike had two points. The first is to learn to work incrementally. That has been somewhat the antithesis of how data work has historically been done, but it’s incredibly important. The second is to really lean into empowerment and the art of the possible. We don’t really know what might happen when we empower thousands of our colleagues to be better able to leverage data. Be excited and open to the journey of finding out what value they create.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/


All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
