Scaling Up Apache Airflow to Enterprise Level

Rabobank in the Netherlands is building an enterprise-grade data mesh, and our team was tasked with implementing Apache Airflow as the de facto orchestration tool for hundreds of teams.

Looking back, we can finally say we did it. Airflow now hosts over 50 teams and runs smoothly, linking data delivery streams with data consumption streams. However, we did not get to this point without making our fair share of mistakes.

This session will cover every step of our journey, including: dealing with user team proficiency (or the lack thereof), struggling with the Kubernetes Executor, trying our hand at zero-downtime deployments, getting our scaling right, and fighting with the PostgreSQL database backend.
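For context, most of these topics come down to a handful of settings in airflow.cfg: the executor choice, scheduler and DAG-level parallelism, and the connection pool against the PostgreSQL metadata database. The snippet below is a minimal illustration of those knobs, using real Airflow 2.x option names but placeholder values; it is not our production configuration, and the connection string and numbers are assumptions for the sake of the example.

    [core]
    executor = KubernetesExecutor
    parallelism = 512                 ; upper bound on concurrently running task instances
    max_active_tasks_per_dag = 32     ; keep a single team's DAG from monopolising capacity

    [database]
    ; placeholder connection string for the PostgreSQL metadata backend
    sql_alchemy_conn = postgresql+psycopg2://airflow:***@airflow-pg:5432/airflow
    sql_alchemy_pool_size = 10        ; SQLAlchemy connection pool size per component
    sql_alchemy_max_overflow = 20     ; extra connections allowed beyond the pool

    [scheduler]
    parsing_processes = 4             ; parallel DAG file parsing processes

Getting values like these right for dozens of tenant teams, rather than for a single project, is exactly where most of our lessons were learned.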
