June 18, 2020
Why I Joined Dremio to Lead Engineering
Executive Vice President of Engineering, Dremio
TL;DR: I joined Dremio to become part of an amazing team that is building a next-generation architecture, purpose-built for the exploding trend toward cloud data lake storage such as AWS S3 and Microsoft ADLS, that will empower enterprises to harness data for faster insights in a cost-effective and compliant way.
The Mission
Data-driven decisions are becoming the new normal and a strategic priority for every C-level executive. In response to the COVID-19 pandemic, executives and decision-makers are looking for faster data insights to drive efficiency across their business and decision-making processes.
The result is an explosion in the volume and variety of data being generated. Unfortunately, current-generation technologies can be expensive, wasteful and slow, making it difficult for enterprises to become data driven. A powerful emerging trend is to store data in cloud data lakes to achieve cost efficiency with low-cost storage; however, this solves just one piece of the puzzle. The rest of the stack needs to be reimagined, because current technologies and approaches such as data warehouses, ETL and cubes reduce agility and adaptability while driving up costs.
This is where Dremio comes in. Dremio offers the industry’s leading data lake engine, delivering high-speed query performance together with a self-service semantic layer that makes it easy for even nontechnical users to access and analyze data. It does this in a way that maintains the flexibility that’s inherent in data lake storage; you don’t have to copy and move your data to a third party or put it in a proprietary format.
Dremio gives Fortune 2000 and mid-size enterprise customers the power to harness insights directly from their data lakes and make data-driven decisions quickly, disrupting a $25 billion+ market. Now, this is a mission that I can get behind!
Product and Technology Opportunity
Imagine the opportunity to work at the intersection of (1) harnessing data and compute at massive scale, (2) building enterprise-grade public cloud platforms and services, (3) enabling innovation to revolutionize open data lake technologies, and (4) contributing back to the open source community to foster innovation. This is the opportunity that drew me to Dremio.
Over the last 20 years I have had the privilege of being part of teams at VMware, Oracle and Instart that have built and operated software platforms at massive scale. Providing infrastructure and services to support hundreds of thousands of enterprises is deeply rewarding and ever so humbling. The scale pushes you in new and creative ways, forcing you to make the impossible possible every day.
The ambition at Dremio is even larger, and given the nature of ever-growing data volumes, big data, analytical insights and AI/ML, our technical challenges are much more complex. This requires us to think big, innovate and push the boundaries of what technology can do today. We are doing this by delivering purpose-built cloud-native platforms, offering services that perform with unbounded scale and are billed with engine- and query-level granularity, and contributing to open source technologies like Apache Arrow, Gandiva, Arrow Flight, Calcite, Parquet, and Iceberg, spanning data formats, data processing, storage, caching, query processing, orchestration, frameworks and user interfaces.
The Team
Not only does Dremio have some of the smartest minds in the field, people who have pioneered several industry-impacting data technologies; it has also built an amazing team across all disciplines. Add to that a culture that values collaboration, authenticity, openness and innovation, and you have a rich environment in which to make a career-defining impact.
If you find any of this intriguing, ping me on LinkedIn or email me at [email protected]. We are looking for engineers and engineering leaders who are passionate about cloud-scale distributed systems, systems software, cloud infrastructure, platforms, big data, data analytics, site reliability, security, CI/CD tools, web frameworks, user interfaces and APIs.
Join us in this journey of democratizing data analytics and insights.