

Dremio today announced that the metadata catalog at the heart of its Apache Iceberg-based data lakehouse now supports other popular metadata catalog services, including Snowflake’s Apache Polaris-based catalog and Databricks Unity Catalog. The lakehouse provider says the move in its Project Nessie-based metadata catalog will bolster architectural flexibility in the cloud, on-prem, and everywhere in between.
Before metadata catalogs suddenly jumped into the big data consciousness earlier this year, Dremio had been quietly backing its own metadata catalog, dubbed Project Nessie, to provide the necessary housekeeping that a lakehouse based on Apache Iceberg tables requires.
So when Snowflake announced the open source Polaris metadata catalog during its user conference in early June, Dremio executives applauded the announcement and the openness that it could foster in the big data community. Seeing close alignment between Polaris and Nessie, which began development in 2020, Dremio executives pledged to work with the Polaris community to merge the two projects.
The Nessie-Polaris merger has yet to happen, but it is still in the plans. “Our goal is to merge the capabilities of Project Nessie into Apache Polaris (Incubating) to create a single, unified catalog,” says James Rowland-Jones, vice president of product at Dremio. “We believe this will become the default catalog for the open-source community. Dremio will continue to focus on seamless enterprise services built around it.”
In the meantime, Dremio is moving forward with development of its own catalog service for technical metadata, dubbed the Dremio Enterprise Data Catalog. Specifically, Dremio today announced several new capabilities in the metadata catalog, which is based on Nessie.
The new bits include integration with the Snowflake metadata catalog service based on Apache Polaris, as well as hooking into Unity Catalog, the metadata catalog that Databricks built for managing data stored in Delta Lake tables. (Unity Catalog does quite a bit more, including lineage tracking, semantic modeling, security, and governance, and it also functions as a regular, user-focused data catalog, but that’s another story.)
Dremio’s move is noteworthy for a couple of reasons. For starters, with its acquisition of Iceberg maker Tabular for between $1 billion and $2 billion and its commitment to essentially merge the Delta Lake and Iceberg specs, Databricks helped ease the concerns of CFOs who worried they would pick the “wrong” table format.
However, while Databricks committed earlier this year to supporting Iceberg tables with a future release of Unity Catalog, that support is not available yet. Dremio’s support for Unity Catalog ensures that Databricks customers who use its metadata catalog can achieve that interoperability today.
“Flexibility is essential for modern organizations looking to maximize the value of their data,” said Tomer Shiran, Founder of Dremio. “With expanded Iceberg catalog support across all environments, Dremio empowers businesses to deploy their lakehouse architecture wherever it’s most effective. We’re 100% committed to giving customers the freedom to choose the best tools and infrastructure while reducing fears of vendor lock-in.”
Dremio’s product, which is officially called the Dremio Enterprise Data Catalog for Apache Iceberg, supports all Iceberg engines through the Iceberg REST API. In addition to supporting Dremio’s own SQL query engine, it supports other Iceberg-compatible query engines, including Apache Spark, Flink, and others.
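The Iceberg REST API mentioned above is a published specification, which is what lets any compliant engine talk to any compliant catalog. As a rough illustration (the catalog prefix, namespace, and table names below are placeholders, not Dremio's actual endpoints), here is a sketch of how an engine builds the request paths the spec defines:

```python
# Sketch: constructing request paths per the Apache Iceberg REST catalog spec.
# The prefix/namespace/table values used below are illustrative placeholders.
from urllib.parse import quote

def table_path(prefix: str, namespace: str, table: str) -> str:
    """Return the REST route for loading a table's metadata.

    Multipart namespaces (e.g. "sales.eu") are joined with the 0x1F
    unit-separator character in the URL, per the Iceberg REST spec.
    """
    ns = quote("\x1f".join(namespace.split(".")), safe="")
    return f"/v1/{prefix}/namespaces/{ns}/tables/{quote(table, safe='')}"

# An engine would issue: GET https://<catalog-host>{table_path(...)}
print(table_path("mycatalog", "sales.eu", "orders"))
# → /v1/mycatalog/namespaces/sales%1Feu/tables/orders
```

Because every engine resolves the same routes, the catalog, not the query engine, becomes the point of interoperability.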
Dremio’s catalog automates many of the housekeeping tasks required to keep an Iceberg-based data lakehouse running at peak efficiency, including table optimization routines such as compaction and garbage collection. It also provides Git-like branching and version control, enabling users to access data as it existed at particular moments in time (so-called “time travel”). The catalog also provides centralized data governance and role-based access control (RBAC), ensuring fine-grained access to data and preventing unauthorized access to sensitive data.
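The Git-like semantics work because, in a Nessie-style catalog, a branch is just a named pointer into an immutable commit history that maps tables to snapshots. The following is a minimal toy model of that idea, not Nessie's actual API:

```python
# Toy model (NOT Nessie's real API) of Git-like catalog versioning:
# each commit is an immutable mapping of table name -> snapshot ID,
# and branches are just named pointers into the commit history.
class Catalog:
    def __init__(self):
        self.commits = [{}]                  # commit 0: empty catalog state
        self.branches = {"main": 0}

    def commit(self, branch, table, snapshot_id):
        """Advance a branch with a new table snapshot."""
        state = dict(self.commits[self.branches[branch]])
        state[table] = snapshot_id
        self.commits.append(state)
        self.branches[branch] = len(self.commits) - 1

    def create_branch(self, name, source="main"):
        """New branch points at the source branch's current commit."""
        self.branches[name] = self.branches[source]

    def read(self, table, ref):
        """Read a table as of a branch head, or time-travel to a commit ID."""
        commit = self.branches[ref] if ref in self.branches else ref
        return self.commits[commit].get(table)

cat = Catalog()
cat.commit("main", "orders", "snap-1")
cat.create_branch("etl")                    # branch off main's current state
cat.commit("etl", "orders", "snap-2")       # rewrite the table on the branch
assert cat.read("orders", "main") == "snap-1"   # main is unaffected
assert cat.read("orders", "etl") == "snap-2"
assert cat.read("orders", 1) == "snap-1"        # time travel to commit 1
```

This is why an ETL job can rewrite tables on a branch, validate them, and only then expose the changes to readers on main.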
Kevin Petrie, vice president of research at BARC, says Dremio’s move helps enterprises deal with the “extraordinary pressure to access, prepare, and govern distributed datasets for consumption by analytics and AI applications.”
“To meet this demand, they need to catalog diverse data and metadata across data centers, regions, and clouds,” Petrie said in Dremio’s press release. “Dremio is taking a logical step to enable this with an open catalog that is based on Apache Iceberg, the emerging standard for flexible table formats, and by integrating with an ecosystem of popular platforms.”
Related Items:
Polaris Catalog, To Be Merged With Nessie, Now Available on GitHub
What the Big Fuss Over Table Formats and Metadata Catalogs Is All About