#44 A Pragmatic Approach to Getting Started with Data Mesh at Northern Trust – Interview w/ Khanh Chau

Provided as a free resource by DataStax AstraDB

In this episode, Scott interviewed Khanh Chau, Lead Architect for the Data Mesh Initiative at Northern Trust.

Khanh believes you have to be passionate about making data better to do a good job implementing data mesh. And it is DEFINITELY a journey, so you need patience and vision. Also, each journey is unique; you can't just copy/paste from another organization. You need to make failure okay, and you should look to make it easy to fail fast, measure, and adjust.

Khanh talked about the need for exec buy-in before heading down the data mesh path. They got that exec buy-in by proving that the total cost of ownership of data was quite high, as consumers had to do a LOT of work to get the data into a usable state.

In internal conversations, the business people were very excited to participate if it meant they could get quality data. Some of the IT/data engineering folks were harder to convince. It was especially hard to get them to shed layers of not-useful technology.

Some IT teams were easier to convince; they had felt the impact of a few too many middle-of-the-night data downtime incidents. Other teams hadn't felt that pain, so they were harder to win over. There was also the incentive of additional possibilities: data mesh meant they could do things they couldn't do before.

Khanh talked about making the platform the easy and right path for 80% of use cases. They focused on making things easy to configure: users declare what transformations they want, and the platform automatically provisions the pipelines. Their goal was to make it easy to make good progress quickly; their time to initial deploy went from 2-3 months per data service to 2-3 weeks per data product, and they hope to drive it down further.
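The pattern described here, declare what you want and let the platform provision it, can be sketched roughly as below. This is a minimal illustration of the general configuration-driven approach, not Northern Trust's actual platform; all names and the transformation registry are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Rows = List[dict]

# Hypothetical registry of pre-built transformations covering the
# common ("80%") use cases; consumers pick from these by name.
TRANSFORMS: Dict[str, Callable[[Rows], Rows]] = {
    "drop_nulls": lambda rows: [
        r for r in rows if all(v is not None for v in r.values())
    ],
    "uppercase_keys": lambda rows: [
        {k.upper(): v for k, v in r.items()} for r in rows
    ],
}

@dataclass
class PipelineSpec:
    """Declarative description of a data product pipeline:
    just a name and an ordered list of transformation names."""
    name: str
    steps: List[str] = field(default_factory=list)

def provision(spec: PipelineSpec) -> Callable[[Rows], Rows]:
    """Turn a declarative spec into a runnable pipeline function.
    Unknown step names fail fast at provisioning time, not at runtime."""
    fns = [TRANSFORMS[step] for step in spec.steps]

    def run(rows: Rows) -> Rows:
        for fn in fns:
            rows = fn(rows)
        return rows

    return run

# A consumer only writes the spec; the platform does the rest.
pipeline = provision(PipelineSpec("customer_view", ["drop_nulls", "uppercase_keys"]))
print(pipeline([{"id": 1, "name": "a"}, {"id": 2, "name": None}]))
# [{'ID': 1, 'NAME': 'a'}]
```

The point of the pattern is that adding a new data product means writing a small spec rather than hand-building a pipeline, which is how a 2-3 month deploy cycle can shrink to weeks.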

Northern Trust has been moving forward with data mesh for about 7 months as part of their high-level digital transformation initiative. On the data side, they had previously focused on data virtualization and data federation but it was not delivering the results they wanted. It was not as scalable as they wanted – it was taking 2-3 months to launch each new data service. They also did not have great information on who was consuming the data and why.

For their data mesh proof of concept, Khanh and team set a timeline of 9 weeks. They needed to prove value by then or data mesh would be a very tough sell internally. Khanh talked about the need to sell data mesh as a paradigm shift in order to get people out of technology-focused thinking.

Northern Trust decided to take a pragmatic approach, for example not pushing all aspects of data ownership fully left. Khanh and team were focused on finding a "happy balance" on data product SLAs and quality: improvement was necessary, but the team preferred done to perfect.

A big focus and a key driver for Northern Trust has been building muscle and learning/evolving along the way. It’s important to evolve quickly and not build muscle in the wrong way.

Northern Trust is still in the early days on figuring out interoperability between data products. It’s more of an art than a science. Khanh believes bi-temporality is more important right now than interoperability.

There are a lot of great learnings to take away from the Northern Trust journey.

Khanh’s LinkedIn: https://www.linkedin.com/in/khanhnchau/

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him at community at datameshlearning.com or on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode created by Lesfm (intro includes slight edits by Scott Hirleman): https://pixabay.com/users/lesfm-22579021/

Data Mesh Radio is brought to you as a community resource by DataStax. Check out their high-scale, multi-region database offering (w/ lots of great APIs) and use code DAAP500 for a free $500 credit (apply under “add payment”): AstraDB
