Dremio, the easy and open data lakehouse, has published “The Data Lakehouse: Data Warehousing and More,” a new research paper now available on arXiv. The paper explores the data lakehouse model, offering modern insights for businesses looking to optimize their data utilization. By releasing the work as a preprint, the authors aim to gather feedback from the open research and scientific community and to make the material available to the wider community of practitioners.
The paper decomposes commonly used but overloaded terms such as data warehouse, data warehousing, and data lakehouse into discrete components (query engine, table format, and so on), then defines each term precisely in terms of those components so that conversations built on these terms can proceed without ambiguity.
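As a rough illustration of that component view (a hypothetical sketch, not an example taken from the paper), the snippet below treats the query engine and the stored data as independent, swappable pieces: DuckDB plays the role of the engine, and plain Parquet files in an assumed warehouse/sales/ directory stand in for the open storage layer that a table format such as Apache Iceberg would typically manage.

# Minimal sketch: the query engine (DuckDB here) is a component separate from the
# data, which lives in open files rather than inside the engine itself.
# The path, table layout, and column names are assumptions for illustration only.
import duckdb

con = duckdb.connect()  # in-process engine; no dedicated warehouse server required
result = con.sql("""
    SELECT region, SUM(amount) AS total_sales
    FROM read_parquet('warehouse/sales/*.parquet')  -- open-format files on disk or object storage
    GROUP BY region
    ORDER BY total_sales DESC
""").fetchall()
print(result)

Swapping in a different engine, or layering a table format that adds transactions and schema evolution on top of the same files, changes one component without rewriting the rest of the stack.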
Data warehousing has long been a cornerstone of modern data-driven organizations, serving as a strategic asset for informed decision-making. However, the emergence of data lakehouses has challenged the traditional paradigms by providing a new approach to achieving the goals of data warehousing, while overcoming its limitations and adding new dimensions of capability.
The paper begins by rigorously defining the often-ambiguous terms “data warehousing,” “data warehouse,” and “data lakehouse.”
“We have seen some folks in the market say that ‘data lakehouse’ is just another marketing buzzword. We understood the arguments on both sides, but whether the statement was right or not was essentially rooted in how you interpret certain terms. We wanted to provide some clarity and, using an approach similar to a math-based proof, show that with clarity on definitions, ‘data lakehouse’ is definitely more than a marketing term. Rather, it’s a practical and valuable approach to data warehousing,” said Jason Hughes, director of technical advocacy.
The paper breaks down what is commonly referred to as “data warehousing” into its fundamental requirements, categorizing them into technical components, technical capabilities, and technology-independent practices. It then shows how a data lakehouse addresses all of these core requirements, thereby demonstrating that a data lakehouse can be used to achieve what is traditionally thought to require an RDBMS-OLAP system. It also highlights the shortcomings of traditional data warehousing on RDBMS-OLAP, including limitations with semi-structured and unstructured data, lock-in and lock-out, and cost issues, prompting a reevaluation of architectural approaches. Additionally, the paper provides a concrete example of a data lakehouse implementation to demonstrate its practical benefits.
The ultimate goal of a data lakehouse is to combine the strengths of RDBMS-OLAP data warehousing and data lakes, fulfilling data warehousing requirements on an open data architecture while expanding into additional analytic capabilities.
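To make the open-data-architecture point concrete, here is another minimal, hypothetical sketch (again not drawn from the paper): because the data sits in open files, two independent engines can read the same copy with no export or load step, which is the property that mitigates the lock-in and lock-out concerns noted above.

# Two independent engines reading the same open files -- no engine-specific export needed.
# The warehouse/sales/ path is an assumption for illustration only.
import duckdb
import pyarrow.parquet as pq

# Engine 1: DuckDB runs SQL directly over the Parquet files.
duckdb.sql("SELECT COUNT(*) AS row_count FROM read_parquet('warehouse/sales/*.parquet')").show()

# Engine 2: PyArrow reads the very same files into an in-memory table for Python tooling.
sales = pq.read_table('warehouse/sales/')
print(sales.num_rows)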
Dremio’s research paper solidifies the concept of data lakehouses and provides a practical roadmap for organizations looking to harness the full potential of their data while optimizing their data architecture.
To read the full paper, please visit: https://arxiv.org/abs/2310.08697