#34 Tackling Challenges Together at Talkdesk: An Early Journey Story – Interview w/ José Cabeda

Sign up for Data Mesh Understanding’s free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here (info gated)

José’s LinkedIn: https://www.linkedin.com/in/jecabeda/

José’s Twitter: @jecabeda / https://twitter.com/jecabeda

In this episode, Scott interviewed José Cabeda, Data Engineer at call-center-as-a-service provider Talkdesk. They discussed the start of Talkdesk's data mesh journey and their progress so far.

When José came across Zhamak's original post, it spoke to a number of the challenges Talkdesk was facing, checking many of the boxes for where they wanted to head. The team started from a single data product and iterated from there. While they are still relatively early in their journey, like most companies on this path, they have already advanced far past their initial use case.

At Talkdesk, a data product is typically a single table or view in Snowflake, but the company's North Star is event streaming as their key mechanism for storing and sharing information. However, it was sometimes difficult to train people to understand the difference between a business event – something that occurred in the real world – and an event streaming event.
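One way to picture the distinction is to model the two separately: the business event captures only the real-world fact in domain terms, while the streaming event is the technical envelope that carries it through the pipeline. This is a minimal sketch with hypothetical names (`CallCompleted`, `StreamEvent` and their fields are illustrative, not Talkdesk's actual schemas):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

# Business event: a fact about the real world, expressed in domain terms.
@dataclass
class CallCompleted:
    account_id: str        # shared business identifier
    agent_id: str
    duration_seconds: int
    completed_at: datetime

# Streaming event: the transport envelope that carries a business event,
# holding purely technical metadata (event id, publish time, schema version).
@dataclass
class StreamEvent:
    payload: CallCompleted
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    published_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    schema_version: int = 1

# The domain fact exists independently of how it is shipped around.
fact = CallCompleted("acct-42", "agent-7", 310, datetime.now(timezone.utc))
envelope = StreamEvent(payload=fact)
```

The point of the split is that producers and consumers reason about `CallCompleted`, while only the platform plumbing cares about the `StreamEvent` wrapper.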

José had a few key takeaways and recommendations for those implementing data mesh:

1. Change will be constant in a data mesh implementation so it is best to standardize the way people and systems will interact as much as possible. Define expectations!

2. Be open to new ideas; there are many challenges ahead, so it's best to face them together.

3. Use a single universal ID for major concepts like account or business events to make interoperability easier / possible.

4. Don’t be afraid to slice your data in different ways to serve different use cases.

5. To drive buy-in, start with a single use case, whether that takes one data product or several (most people recommend 2-3 data products in a PoC), so you can show why data mesh is a good idea.
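Takeaway 3 – a single universal ID for major concepts – can be sketched as follows. Assuming two hypothetical data products that both key their records by the same `account_id` (the product names and figures below are illustrative), a consumer can combine them directly, with no mapping tables:

```python
# Two hypothetical data products keyed by the same universal account_id.
calls_per_account = {"acct-42": 118, "acct-99": 57}              # "calls" data product
plan_per_account = {"acct-42": "enterprise", "acct-99": "pro"}   # "billing" data product

# Because both products share the ID scheme, joining them is a plain
# key intersection rather than an identifier-reconciliation project.
report = {
    acct: {"calls": calls_per_account[acct], "plan": plan_per_account[acct]}
    for acct in calls_per_account.keys() & plan_per_account.keys()
}
```

Without an agreed ID, each pairing of data products would need its own translation layer, which is exactly the interoperability cost the recommendation avoids.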

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/


All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
