Blog: How a data platform and federated data & analytics model can help you win the Tour de France pool.

Dave van den Hurk, Thursday 29 June 2023, 10:37

I like data, I like sports, I like playing games and I generally hate losing. This cocktail makes entering the annual Tour de France pool a serious business for me.

If you thought that, when putting together my team of cyclists, I limit myself to reading a bit of sports news and relying on my intuition: think again.

It starts with a thorough analysis of the course: what do the 21 stages look like? How many time trial kilometres, flat finishes, mountain stages, elevation metres and uphill finishes are there? Then, crucially: which cyclists are in form? What were the results of the spring classics? Who is riding which preparation race, how strong is the competition there and what results are being achieved? Of course, the results of previous editions of the grand tours cannot be left out either, including the Giro and the Vuelta. Are there positive trends hidden in there that my competitors overlook, with which I can outsmart them? Are there any clues in the team presentations? Which teams look the strongest? What is being said in the various interviews? Who is going for the GC, who has to ride as a domestique and who is mainly chasing stage wins?

If you think the data within your company is hidden in silos, try putting yourself in my shoes during my annual data collection in preparation for the Tour de France pool. But there is hope on the horizon. The popularity of the Tour de France pool has taken off. A few years ago, for instance, I discovered a genuine cycling data paradise: a website where almost everything was kept centrally and made available in raw form. You could unleash your own queries and analyses on that data to arrive at your own insights and choices. There were even people who developed complex predictive algorithms and made the results available as data enrichment.

Notice here the bridge from cycling-pool fetishism to a hot topic in the data field: the interplay between a (cloud) data platform and the federated data & analytics organisational model. In this model, decentralised analytics teams are enabled, through a central data platform and on the basis of joint governance agreements, to work autonomously on their own data products. In turn, these data products can be shared with other decentralised analytics teams.

For health insurer CZ, as for most other companies, data is essential. Data helps us, for example, to improve our customer experience, to cooperate with healthcare providers and regulators, and to detect fraud. Our cloud data platform, based on the innovative Lakehouse concept, offers the perfect solution here. This platform serves as an enabler for federated data analytics teams, so that value creation from data can take place within our business domains. So what makes this interplay so powerful for us?

  • Integrated Data Lake and Data Warehouse functionalities

A Lakehouse combines the power of a Data Lake and a Data Warehouse in one platform, while source data is accessed and processed only once. Federated analytics teams can then access both unstructured and structured data, while having the flexibility to explore, transform and analyse the data in a unified and streamlined way. This enables teams to quickly generate insights and take advantage of the rich information present in the Data Lake.
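
To make this concrete, the sketch below shows what such unified access could look like in practice. It is a minimal, illustrative example assuming a Spark-based Lakehouse; the paths, table and column names are made up for this post and are not our actual data model.

```python
# Minimal sketch: querying raw lake data and a curated warehouse-style table
# with one engine. Paths, table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Semi-structured source data landed in the lake (e.g. raw JSON events)
raw_events = spark.read.json("/lake/raw/customer_events/")

# Curated, warehouse-style table registered in the catalog
customers = spark.table("curated.customers")

# Explore, transform and analyse both through the same unified API
insights = (
    raw_events
    .join(customers, on="customer_id", how="inner")
    .groupBy("segment")
    .agg(F.count("*").alias("event_count"))
)
insights.show()
```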

  • Real-time data access and processing

The data platform gives us the ability to access, process and analyse data on a high-frequency, real-time basis if required. This enables us to act proactively in an environment of changing conditions.
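
As an illustration, a near-real-time job on such a platform could look roughly like the sketch below. It assumes Spark Structured Streaming with a Kafka source; the broker address, topic name and storage locations are illustrative assumptions, not our actual setup.

```python
# Minimal sketch: streaming ingest into the lake. All names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Read a stream of incoming events (e.g. claims or customer interactions)
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "incoming_events")
    .load()
)

# Light transformation before landing the stream in a queryable table
parsed = events.selectExpr(
    "CAST(value AS STRING) AS payload",
    "timestamp AS event_time",
)

query = (
    parsed.writeStream
    .format("delta")  # or "parquet"; the storage format depends on the platform
    .option("checkpointLocation", "/lake/checkpoints/incoming_events")
    .outputMode("append")
    .start("/lake/bronze/incoming_events")
)
```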

  • Resilient data management and data governance

The Lakehouse concept places a strong emphasis on data management and data governance. For example, federated analytics teams can use automated processes for data management, data quality and data security. This ensures efficient and reliable data pipelines that meet the highest data protection and compliance standards. Analytics teams can thus focus on analytics and value creation with confidence.
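
A simple example of such an automated check is sketched below: a quality gate that stops a pipeline when basic expectations are violated. The table, columns and rules are illustrative assumptions; in practice a dedicated data quality framework would typically handle this.

```python
# Minimal sketch: an automated data quality gate in a pipeline.
# Table name, columns and rules are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

df = spark.table("curated.claims")

# Basic expectations: no missing keys and no future-dated claims
null_keys = df.filter(F.col("claim_id").isNull()).count()
future_rows = df.filter(F.col("claim_date") > F.current_date()).count()

if null_keys > 0 or future_rows > 0:
    # In practice this would alert the owning team and block promotion
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, "
        f"{future_rows} future-dated rows"
    )
```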

  • Scalability and flexibility for future growth

A Lakehouse on a cloud data platform provides scalability and flexibility for future growth. Analytics teams can easily scale to larger data sets and more complex analytics such as Machine Learning and Artificial Intelligence, without having to worry about infrastructure limitations.
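
As a small illustration of that step from analytics to Machine Learning on the same platform, the sketch below trains a simple model with Spark MLlib directly on a Lakehouse table. The table and column names are, again, illustrative assumptions.

```python
# Minimal sketch: scaling from analytics to ML on the same data.
# Table, feature and label columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("ml-sketch").getOrCreate()

train = spark.table("curated.training_data")

# Assemble feature columns into a single vector column for MLlib
assembler = VectorAssembler(
    inputCols=["feature_a", "feature_b", "feature_c"],
    outputCol="features",
)
model = LinearRegression(featuresCol="features", labelCol="label").fit(
    assembler.transform(train)
)
print(model.coefficients)
```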

In summary, the interplay between a cloud data platform and the innovative Lakehouse concept provides a powerful environment for federated analytics teams to work autonomously with data. Meanwhile, with the Tour de France just around the corner, I too am busy as a federated analytics team analysing the latest updates on the race. Since entering cycling data paradise, this preparation has become increasingly streamlined and I am steadily creeping towards the top of the Tour de France pool ranking. Fortunately, unpredictability also continues to play a role. And fortunately, ChatGPT does not yet write a race preview. With that reassuring thought, I foresee a bright Tour de France pool future. With a data platform...!

Want to learn more about the federated data organisation and platform, the answer to the central vs. decentralised question? Then sign up for the CEG Data on 5 September at CZ (in collaboration with PostNL) in Tilburg. We will share knowledge and experiences on, among other things, technology, governance, organisation, competences, capacity and ownership. See you there!

Dave van den Hurk
Product Owner Datalake Platform, CZ Zorgverzekeringen

Tags: CIO Platform Nederland
