#107 Focusing on Outcomes and Building Brave Teams in Data – Interview w/ Gretchen Moran

Data Mesh Radio Patreon – get access to interviews well before they are released

Episode list and links to all available episode transcripts (most interviews from #32 on) here

Provided as a free resource by DataStax AstraDB; George Trujillo’s contact info: email (george.trujillo@datastax.com) and LinkedIn

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here

In this episode, Scott interviewed Gretchen Moran, the Senior Director, Data Products at the National Geographic Society (NGS; the non-profit arm of National Geographic).

Some key takeaways/thoughts from Gretchen’s point of view:

  1. NGS is a bit unique in that they don’t have a widely deployed data architecture, so they don’t have a lot of habits to unlearn. Starting greenfield likely means more training and learning/experimenting will be required, but at least there is no institutional unlearning.
  2. To move forward with data mesh, organizations must be able to embrace change – and the pain that it will inevitably bring – and embrace ambiguity. You need to move forward and figure it out together but also be okay with failure as a learning experience as you test what works for your organization.
  3. To win the hearts and minds of data producers, show them what high-quality data can mean for the organization and their domain/role. Work closely with them, understand their context, hold their hand to bring them along and align them to the vision of data mesh.
  4. It’s easier to drive buy-in widely if you find the organizational influencers and win them over. It is the domino effect in practice. Partner closely with the influencers early on to drive your initiative forward.
  5. For NGS, they are working with a single initial data-producing team for their proof of value. The data mesh world seems to be split between working with one team versus two to three teams in the initial proof-of-value stage.
  6. “Any technology effort is still a people effort.”
  7. In the data and analytics space, we have yet to learn how to leverage the knowledge and context of people without a data background. This is what data mesh tries to unlock but we are still figuring out how to do it well.
  8. It’s very easy to intimidate people with data. We need to make tech and especially data much less intimidating to push broader adoption. The business context of those who aren’t yet data literate can be extremely valuable. We need to lower the actual bar to leveraging data but also lower the perceived bar to leveraging data.
  9. “Metrics + outcomes = value” – without outcomes attached, metrics have no value.
  10. Automation is going to be key to many aspects of data mesh. Upskilling people to leverage data will only really pay off if it doesn’t mean a large increase in the amount of work to leverage data.
  11. User experience is crucial to getting the most value out of your data. Think about your data user experience (DUX) and bring in designers to help optimize the experience and really focus on data as a product thinking.
  12. NGS is still trying to find who should own generating and sharing insights on data combined from multiple domains. Is that a centralized insights team? Does that push us too far back towards centralization? It’s still early days but those insights are crucial to driving value from data.
  13. We will see where new insights come from in data mesh. Will it be more insights from data consumers as they can spend time on data analysis instead of data cleaning? Or will it be the data producers as they learn how to really leverage their own data? Both?
  14. Many are waiting for vendors to validate data mesh but they really haven’t done that very well yet. It remains to be seen if they even could validate something so complex and large in scope.
  15. Building brave teams – teams that aren’t afraid of new challenges or of failure or especially of ambiguity – will be crucial to getting data mesh right. Teams might be afraid of doing things “wrong” but as long as they work to get to a good state and incorporate feedback and learnings, that’s what will drive much more value in the long-run.
  16. There is really shared ownership of data even inside the same domain. The subject matter experts and the people shaping the data products must build a strong relationship with good communication.
  17. Data mesh aims to – and needs to – solve for the cost of change in data. Traditional data warehouses have had an extremely high cost of change, as have the huge cascading pipeline setups most organizations run on top of data lakes. Data mesh needs to make evolution in data much easier, quicker, safer, less costly, etc. It’s still early days there.
  18. There is a delicate balance between over architecting and underinvesting in your platform. Look to build for reuse and don’t lock yourself into decisions where possible. Far easier said than done.
  19. Many teams are worrying if they are doing data sharing wrong. But can they actually really do it wrong? Yes, probably, but if they are open to feedback and paying attention, they don’t have to get it “right” the first time. You can evolve to get to a very good place – it’s just that prior data setups have been so rigid that evolution has been tough and costly.
  20. You probably don’t need to build out as large of a team as you might think to start on your data mesh journey – depending on your timeline. It will take years to get to delivering fully on your vision but you can add a lot of value as you progress and learn – you don’t have to get it perfect at the start! And consider what skillsets are really crucial.

Gretchen started off by giving her background and some of the ways her history has played into her perspective and current role. A big factor in her interest in data mesh was helping a number of large organizations evolve their data platforms and how that helped those organizations deliver better results – but they still often struggled somewhat to derive full value from their data and data mesh can hopefully unlock that value.

For the National Geographic Society relative to data mesh, preparation has been Gretchen and team’s keyword. Rather than trying to move forward with their data mesh implementation as fast as possible, they’ve spent the last year testing and preparing for implementing their data strategy. And they are in a bit of a unique situation because even though the organization is over 100 years old, really their tech stack is about seven years old and they don’t have a cohesive data architecture deployed. This means they also don’t have a lot to unlearn but have a ton to learn and experiment on.

NGS is already organized in a product-centric approach and the technologists really understand their domains. Now, Gretchen and team just need to get them bought in that they should treat their data like they do their applications – like a product – as they move forward with their data mesh implementation. Easier said than done but the organization in general hasn’t been pushing back on these ideas, which has meant good initial collaboration.

While embracing sharing data is crucial to NGS’ overall organization-wide strategy, it’s not all sunshine and rainbows as Gretchen knows there will be a lot of heavy lifting. Heavy lifting around change, heavy lifting in going against the status quo. To be successful with a data mesh implementation, the organization is going to have to embrace change and ambiguity. And both are typically painful.

Gretchen and team knew that to lay the groundwork for something like data mesh, their organization needed a base layer of data literacy. Without an understanding of data, would they even have people to consume data, much less people capable of and willing to produce their data like a product? So they started by bringing in consultants to help people start learning and to build a general business glossary.

But to really reach the “hearts and minds” of the general organization, Gretchen knew they needed to show people what value data can bring them. What are their goals and needs, and how can data support them? How is something like having high-quality data available valuable to data consumers and – trickier still – how is it also valuable to data producers? Part of their data literacy/upskilling process was showing people what using data could mean for them, not just running a training course in SQL or Tableau. Just training people how to use a piece of technology in a vacuum hasn’t worked well. A success vector for Gretchen has been finding the organizational “influencers” that provide the leverage to drive buy-in across the org.

So, how are Gretchen and team getting going after their preparation period? They are partnering very closely with an initial pilot team and are going to prove out the value to share more broadly across the organization. This has been an interesting question in data mesh – how many domains and/or data products should be involved in your proof of value?

Broadly speaking, Gretchen believes – and Scott agreed – any technology effort is still very much a people effort. It is very hard to make data self-evident, so people need to steer and steward any technology effort that attempts it. Then add in the fact that data mesh is much more organizational/process focused and the people side becomes even more crucial.

Gretchen talked about metrics in general and her theory of the “bad metrics sin” – that it is worse to have bad metrics than no metrics at all. And to identify early and then stay away from vanity metrics. She strongly believes that metrics + outcomes = value so without the outcomes, metrics don’t have value. As Sadie Martin and Katie Bauer mentioned in their episodes, measure what matters and measure what you will act on. And measuring impact in the NGO (non-governmental organization; a term for many non-profits) space is particularly difficult – Gretchen used the word persnickety – so really finding your useful metrics and backing them up can be a challenge but is crucial.

One behavioral change Gretchen is pushing heavily as people learn more and more about data is asking about people’s rationales when they make choices while working with data. The outcome is more context for all involved, because people use their context to make choices, so learning why they made a choice can highlight very interesting points. Why did they go with X versus Y? It’s crucial to do this to enhance curiosity and learning rather than asking people to prove their reasoning/understanding. So ask with the tone and goal of “tell me more”.

And it’s easy and quite common to intimidate people with data, per Gretchen. We need to lower the actual bar to leveraging data but, even more so, we need to lower the perceived bar of how challenging it is to leverage data. Part of doing that is meeting people where they are, showing them how they can leverage their current knowledge and skills while upskilling them to be even more effective.

Gretchen is seeing people in NGS jumping through many hoops to produce reports and data in very manual ways, so enabling them to produce and consume data automatically and more reliably is something she’s excited to take on. That way, NGS can leverage their knowledge and skills without the manual effort – allowing them to focus on the value-add aspect of working with data: the insights and how to act on them.

User experience (UX) is really crucial to everything NGS does, per Gretchen. Their product managers spend a lot of time really understanding the business aspect of things, not just the software pieces. Now they need to learn how to do the same with data. Product thinking is crucial to getting data mesh right, not just creating data products. How can we move to sharing actual insights instead of just data? And especially, who owns creating and sharing insights on data combined from multiple domains?

For Gretchen, it will also be interesting to see what additional insights can be generated when we focus on keeping data clean from the start, not just cleaning it up after the fact by data consumers. What additional insights might come from people actively monitoring the collection and processing of information? Who will generate the new insights? Will it be the traditional data consumers who can now spend the time to work with the data instead of clean it? Or will more insights flow from the data producers as they really get their arms around their own data? The answer is probably both.

Building brave teams – teams that aren’t afraid of new challenges or of failure or especially of ambiguity – will be crucial to getting data mesh right in Gretchen’s view. People have to welcome change and understand that while change is painful, there is a point and purpose for it. Give them the understanding of what the change is for, what is the reasoning.

Gretchen and team are trying to ensure they aren’t over architecting the data platform – putting in too much work too early and locking themselves into choices where there isn’t a need. But it’s also quite easy to underinvest and not provide what people actually need. So she’s really focused on making the platform robust enough but not too rigid or expensive. It’s a hard needle to thread.

Many teams in NGS are worrying if they are doing data sharing wrong, per Gretchen. But can they actually really do it wrong? Yeah, probably, but if they are open to feedback and paying attention, they don’t have to get it “right” the first time to get it right eventually. You can evolve to get to a very good place – prior data setups have been so rigid that change has been extremely painful and that evolution has been tough. Data mesh needs to solve for lowering the cost and fear of change in data but it’s still early days.

Gretchen doesn’t think you need to build out a huge team to do data mesh, or at least to get moving. Her team’s approach is to build a reusable base for generating and managing mesh data products and have a few data architects to keep moving things in the right direction. Then, they have the team and the drive to teach developers how to manage data as a product and get them bought in that it’s necessary to do so.

Some rapid-fire insights from Gretchen to wrap up:

In the data and analytics space, we have yet to learn how to leverage the knowledge and context of people without a data background. This is what data mesh tries to unlock but we are still figuring it out.

There are good incentives for teams to produce high quality and reliable data but you have to work with them closely to explain it.

The concept of the data lake was to invest in cleaning and maintaining the data only when there was a clear use-case, a clear reason to invest that time. But it was clean-up, not proactive cleaning, and typically had opaque and/or mediocre ownership – that made it much harder to derive the value.

Vendors have yet to really validate data mesh and that means many folks are still sitting on the sidelines. It will be interesting to see if vendors really can ever validate it given how complex and large in scope data mesh really is.

To do data mesh right, many stakeholders need to parse the principles – or at least what the principles are trying to achieve – and then, crucially, adapt them to their own culture. Data mesh can’t be about cutting and pasting from someone else’s implementation.

Shared ownership of data is very hard. That seems obvious but even within the domain, there is shared ownership between the subject matter experts and those shaping the data to share externally. There needs to be strong communication and a good relationship between those parties.

Really spend the time to consider what skillsets you actually need. And when you will need them. It’s okay to have more basic data products in the early days of a mesh implementation as developers learn how to work with data properly.

Gretchen’s LinkedIn: https://www.linkedin.com/in/gretchenmoran/

NGS’ current openings: https://ngs.wd1.myworkdayjobs.com/ngs_external_career_site

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him at community at datameshlearning.com or on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, and/or nevesf

Data Mesh Radio is brought to you as a community resource by DataStax. Check out their high-scale, multi-region database offering (w/ lots of great APIs) and use code DAAP500 for a free $500 credit (apply under “add payment”): AstraDB
