Gnarly Data Waves

GDW Special Edition 7


September 17, 2024

An Apache Iceberg Lakehouse Crash Course – The Role of Apache Iceberg Catalogs

Learn the importance of catalogs in Apache Iceberg:

- Different catalog options (Nessie, Polaris, AWS Glue)
- How catalogs facilitate data management and discovery
- Integrating catalogs with your existing data infrastructure
- Practical usage scenarios and tips
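To make "integrating catalogs with your existing data infrastructure" concrete, the sketch below collects the Spark configuration properties that point an Iceberg catalog at a Nessie server versus AWS Glue. The property keys come from the Apache Iceberg documentation; the catalog name (`lakehouse`), the Nessie URI, and the warehouse path are placeholder assumptions you would swap for your own environment.

```python
# Hedged sketch: Spark properties for two Apache Iceberg catalog backends.
# The catalog name "lakehouse", the server URI, and the warehouse path are
# placeholders -- substitute the values from your own environment.

# Nessie-backed Iceberg catalog (git-like, versioned catalog)
nessie_catalog = {
    "spark.sql.catalog.lakehouse": "org.apache.iceberg.spark.SparkCatalog",
    "spark.sql.catalog.lakehouse.catalog-impl": "org.apache.iceberg.nessie.NessieCatalog",
    "spark.sql.catalog.lakehouse.uri": "http://nessie:19120/api/v2",      # placeholder
    "spark.sql.catalog.lakehouse.ref": "main",                            # Nessie branch
    "spark.sql.catalog.lakehouse.warehouse": "s3://my-bucket/warehouse",  # placeholder
}

# AWS Glue-backed Iceberg catalog
glue_catalog = {
    "spark.sql.catalog.lakehouse": "org.apache.iceberg.spark.SparkCatalog",
    "spark.sql.catalog.lakehouse.catalog-impl": "org.apache.iceberg.aws.glue.GlueCatalog",
    "spark.sql.catalog.lakehouse.warehouse": "s3://my-bucket/warehouse",  # placeholder
    "spark.sql.catalog.lakehouse.io-impl": "org.apache.iceberg.aws.s3.S3FileIO",
}

def spark_submit_args(conf):
    """Render catalog properties as --conf flags for spark-submit."""
    return [f"--conf {key}={value}" for key, value in conf.items()]
```

Because only the `catalog-impl` (and its backend-specific properties) change, switching catalog services does not change how your Spark jobs read or write the tables themselves.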

Learn how to bring streaming data into Apache Iceberg tables. Topics include:

- Setting up streaming pipelines with Apache Iceberg
- Real-time data ingestion and processing
- Basics of different solutions like Kafka Connect, Flink, and Upsolver
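The ingestion pattern behind all three of those tools is the same: a streaming sink buffers records into micro-batches, writes them as immutable data files, and commits each batch atomically as a new table snapshot. The toy model below illustrates that commit loop only; no real Iceberg, Kafka, or Flink is involved, and every name in it is illustrative.

```python
# Toy model of Iceberg-style streaming ingestion: each micro-batch is
# written as immutable data files, then published atomically as a new
# table snapshot. Readers always see a complete, consistent snapshot.
class ToyIcebergTable:
    def __init__(self):
        self.data_files = []   # immutable files already committed
        self.snapshots = []    # one entry per successful commit

    def commit_append(self, new_files):
        """Atomic append commit: publish a snapshot covering all files."""
        self.data_files = self.data_files + list(new_files)
        snapshot = {"snapshot_id": len(self.snapshots) + 1,
                    "file_count": len(self.data_files)}
        self.snapshots.append(snapshot)
        return snapshot

# A streaming sink (Kafka Connect, Flink, Upsolver, ...) loops over
# incoming micro-batches and commits each one:
table = ToyIcebergTable()
table.commit_append(["batch-0001.parquet"])
table.commit_append(["batch-0002.parquet", "batch-0003.parquet"])
```

Because each commit is all-or-nothing, a failed batch never leaves half-written data visible, which is what makes frequent small streaming commits safe.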

An Apache Iceberg Lakehouse Crash Course is a 10-part web series designed to help you master Apache Iceberg, the open table format at the heart of the data lakehouse architecture.

Our expert-led sessions will cover a wide range of topics to build your Apache Iceberg expertise. Each session will offer detailed insights into the architecture and capabilities of Apache Iceberg, along with practical demonstrations.

Whether you’re a data engineer, architect, or analyst, this series will equip you with the knowledge and skills to leverage Apache Iceberg for building scalable, efficient, and high-performance data platforms.

Ready to Get Started? Here Are Some Resources to Help

- Analyst Report: Preparing and Delivering Data for AI
- Infographic: AI Leader or Follower? Benchmark Your AI Readiness
- Analyst Report: Navigating the Shift to a Lakehouse Architecture