#79 A Data Success Secret Recipe: Comfort with Ambiguity and Change Management – Interview w/ Vincent Koc

Data Mesh Radio Patreon – get access to interviews well before they are released

Episode list and links to all available episode transcripts (most interviews from #32 on) here

Provided as a free resource by DataStax AstraDB

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here

In this episode, Scott interviewed Vincent Koc, Head of Data at the merchant platform company hipages.

To start with some big takeaways from Vincent:

  • If you aren’t comfortable with an agile mindset and ambiguity, bleeding edge probably isn’t for you – and that’s okay!
  • You and your organization need to be comfortable with failing, learning, and then iterating
  • To get data mesh – or really any big change initiative in data – right, you should focus on change management much more than you probably think
  • Change management ends up being the secret sauce – or the crucial missing factor – much more often than the tech
  • Think problem-specific, not technology-specific
  • It’s easy to over-engineer the problem – technologists want to technology
  • In general, consistency is key to achieving widespread success in data
  • One domain having a major success won’t lead to broader org-wide success unless you leverage reusability to make consistency across other domains easy – a bunch of great but inconsistent solutions doesn’t add up to a valuable whole

For Vincent, every organization considering data mesh should ask if it is really the correct approach for them. Data mesh really isn’t for a large subset of organizations, whether that is right now or ever. If your organization doesn’t have an appetite for change, it’s going to be very tough to move towards data mesh. If you want to implement data mesh, he recommends embracing an agile methodology, e.g. fast feedback and trial and error.

When thinking about splitting your data monolith into domains, Vincent recommends taking a lot of learnings from what works well in the microservices realm. You shouldn’t decompose everything all at once – that just creates chaos. You can split out larger domains one by one and then figure out if you need to split them further when there is more value in doing so. Peel them off instead of a big bang approach.

Vincent believes that, in general, ~20% of your teams will consume ~80% of your data team’s time and energy. There are a few ways to work with those teams to reduce that, but it is also somewhat a fact of life – whether because those domains are more prominent, noisy, well loved, or demanding for many other reasons. That disparity in data work often leads those areas to become more data mature.

When discussing disparate data maturity, Vincent talked about the need to drive all domains that will participate in something like data mesh to at least a common base level of maturity. You have to have a relatively mature domain to be at what he referred to as “mesh-level capability”. Domains that aren’t at that capability will still need to rely more heavily on centralized data teams and capabilities as they improve their data maturity. And it’s okay to work with them closely to raise their maturity level – just telling them to catch up is probably not going to work; there will need to be a bit of hand-holding.

Vincent believes embedding data analysts – whether you call them data analysts, analytics engineers, or something else – into domains is crucial, especially if you are going to attempt to implement data mesh. They serve as the custodians of the domain’s data, whether that is data shared with others via a data product, as in data mesh, or data the domain regularly consumes from internal and external sources. One point Vincent thinks is crucial to moving forward in data mesh: those data custodians need to help disseminate data knowledge to software engineers, and the organization needs to build tools and frameworks that make it easier for software engineers to own and manage data.

When asked about the balance between long- and short-term planning, Vincent said that with an agile methodology, you need to “cater to today” and not get overly focused on an exact long-term plan/roadmap. Things will change. Set yourself on a good path, set that North Star, and keep your ears and eyes open for the signals that you need to change your plan. E.g. look at what a domain could become with the right direction and set it on that path to positive evolution as best as possible. But don’t overly define the path.

Vincent talked about how crucial – and how often overlooked or ignored – good change management is for data organizations, now more than ever given how fast the world and the data landscape are changing. It’s crucial to break changes down into terms and/or actions that all constituents can understand. The organizational side is much more important than the technology in most respects.

To drive buy-in, Vincent has seen that giving people agency over their data works well. Giving domains the trust AND the tools/frameworks/resources to manage their data gets those business leaders/domain owners to come to the table quite often. But you can’t just give them the responsibility without the additional help.

Vincent discussed some past approaches that he would do differently now. He focused too much on the technology and on telling others exactly how something should work or look, instead of working with them to drive toward the outcome and letting them figure out the right path – while helping them along the way. Give them the big picture and talk to the outcome rather than the technology.

To do that, Vincent talked about building out a pattern but not the whole picture: give them a defined enough idea of a good outcome without over-defining it. And build in the easy path or golden path – boilerplates, where it makes sense, through templates, but with extensibility – if things are too rigid, people will thrash against that. It is quite easy to over-engineer solutions – think problem-specific, not technology-specific.
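One way to picture a golden-path template with extensibility is a base pipeline that defines the paved road while each domain overrides only the steps that differ. This is a minimal sketch of the pattern, not anything prescribed in the episode – all class and method names are illustrative:

```python
# Sketch of a "golden path" template with extension points. A central
# platform team could own BasePipeline; domains subclass it and override
# only what is specific to them, keeping everything else consistent.

class BasePipeline:
    """Golden-path defaults a platform team might provide."""

    def extract(self) -> list[dict]:
        # Every domain has a different source, so this is the one
        # step each adopter must supply.
        raise NotImplementedError("each domain supplies its own source")

    def validate(self, rows: list[dict]) -> list[dict]:
        # Sensible shared default: drop rows missing an id.
        # Domains can extend or replace this if their data differs.
        return [r for r in rows if "id" in r]

    def publish(self, rows: list[dict]) -> int:
        # Default publish step; here it just returns a row count,
        # standing in for writing to a shared sink.
        return len(rows)

    def run(self) -> int:
        return self.publish(self.validate(self.extract()))


class OrdersPipeline(BasePipeline):
    """A domain adopts the template, overriding only extraction."""

    def extract(self) -> list[dict]:
        return [{"id": 1}, {"id": 2}, {"amount": 5}]  # last row lacks an id


print(OrdersPipeline().run())  # 2 (the invalid row is dropped by the default)
```

The design choice mirrors Vincent’s point: the template is defined enough to keep domains consistent, but extensible enough that they don’t thrash against it.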

Specific to data mesh, Vincent sees one of the big remaining questions as how to actually automate and decentralize governance, especially things like access control. There aren’t many good specifics on how to set up security as code in a scalable way. And we need to think about security as a sliding scale of risk.
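To make “security as code” and a sliding scale of risk concrete, one shape this could take is access policies declared as data, versioned alongside the data products they protect, so that granting access becomes a reviewed change rather than a ticket. This is only a sketch of the idea – the policy names, roles, and sensitivity levels below are all hypothetical, not from the episode:

```python
# Minimal sketch of "security as code": policies are plain data that could
# live in a domain's repo and be enforced by automation. Sensitivity is
# modeled as a sliding scale of risk (0 = public .. 3 = restricted).

from dataclasses import dataclass


@dataclass(frozen=True)
class Policy:
    data_product: str     # hypothetical data product name
    role: str             # consumer role granted access
    max_sensitivity: int  # highest risk level this role may read


# Policies as code: adding a row is a pull request, reviewable and auditable.
POLICIES = [
    Policy("payments.transactions", "analyst", max_sensitivity=1),
    Policy("payments.transactions", "fraud-engineer", max_sensitivity=3),
]


def can_access(role: str, data_product: str, sensitivity: int) -> bool:
    """True if some policy grants `role` access at this sensitivity level."""
    return any(
        p.data_product == data_product
        and p.role == role
        and sensitivity <= p.max_sensitivity
        for p in POLICIES
    )


print(can_access("analyst", "payments.transactions", 2))         # False
print(can_access("fraud-engineer", "payments.transactions", 2))  # True
```

In practice a real implementation would likely sit on a policy engine rather than hand-rolled checks, but the core move is the same: access rules become versioned, testable artifacts instead of one-off grants.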

Vincent shared a few words of wisdom for data leaders near the conclusion. The first was how easy it is to see something like data mesh, get very excited, and try to make a whole lot of changes at once. Making mass changes causes instability – instead, think about where you can be more targeted in support of the long-term big picture. Don’t boil the ocean. The second was to get comfortable with ambiguity and consider whether your organization is aligned on accepting ambiguity – if you aren’t comfortable with ambiguity, stay away from the bleeding edge. It’s called bleeding for a reason. You need the ability to try/test, fail, learn, and then iterate toward a better solution. If failure isn’t allowed – and maybe even celebrated – the bleeding edge is probably for braver souls.

Vincent’s LinkedIn: https://www.linkedin.com/in/koconder/

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him at community at datameshlearning.com or on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, and/or nevesf

Data Mesh Radio is brought to you as a community resource by DataStax. Check out their high-scale, multi-region database offering (w/ lots of great APIs) and use code DAAP500 for a free $500 credit (apply under “add payment”): AstraDB

1 thought on “#79 A Data Success Secret Recipe: Comfort with Ambiguity and Change Management – Interview w/ Vincent Koc”

  1. Great episode on a really important topic: change management. Without solid change management practices in place, no implementation – transformation, really – of Data Mesh magnitude can hope to succeed. A couple of complementary perspectives, though:

    1) Change management needs to build on a strong foundation of WHY – not only at the start of the change program, but integrated into continuous communication. The WHY cannot be found *within* data capabilities – be they about architecture, technology, or organization. It needs to be rooted in business strategy and, more importantly, in the business environment: how data, analytics, and AI are going to help us compete, and why Data Mesh is the ultimate long-term solution for just that.

    2) Because of the above, the WHY NOT does not come from within data capabilities either: “Because it is so difficult”, “Because we don’t know how to be Agile”, “Because we don’t master DDD”. That would be fatalistic. Empathy is an essential part of change management, but too much of it in the wrong place leads to the demise of the whole organization, given the forces in the competitive landscape – not much empathy in that. Change management for a Data Mesh implementation needs to be an integral part of strategic management.
