
June 3, 2025

Snowflake Widens Analytics and AI Reach at Summit 25

Snowflake unleashed a torrent of news today at its annual user conference, ranging from a new adaptive compute capability and Openflow data integration to new analytic capabilities and AI enhancements, such as Cortex AISQL and new AI agents. The company also announced its intent to acquire Crunchy Data, a provider of Postgres databases and services.

Snowflake and 16,000 of its customers have taken over the Moscone Center in San Francisco for Summit. Snowflake used the show, previously called Data Cloud Summit, as the launchpad for several announcements, including its intent to acquire Crunchy Data, which runs a hosted version of Postgres and develops a Postgres-based engine that works with Parquet data. For more on Crunchy, check out the article we wrote about them last July.

Meanwhile, back at Summit, Snowflake made announcements in five key areas, including its platform, analytics, data engineering, AI and machine learning, and applications and collaboration. Here’s a brief rundown of the new stuff in each category:

Snowflake Platform

On the platform side, Snowflake is announcing its new adaptive compute functionality, which allows Snowflake to manage the resources used for customer queries. Instead of choosing T-shirt sizes for their data warehouses, customers select a single option, and Snowflake automatically scales the compute resources behind the scenes as the workload changes.

The new service will save customers money, says Snowflake EVP Christian Kleinerman.

Snowflake is launching a new adaptive compute option for its data warehouse (Image courtesy Snowflake)

“We are improving materially the price performance of the platform at Snowflake, and by virtue of simplifying the allocation of queries to compute resources, we are also helping customers make better utilization of the compute resources that they have allocated,” he says. “We’ll start with an adaptive warehouse concept, and over time we will include code execution clusters as a way to simplify and unify our compute models.”

Snowflake has also launched a Gen2 data warehouse that brings a 2X to 4X performance boost for write-heavy and update-heavy workloads. “We’re incredibly excited that with these two combinations, we continue to position Snowflake at the forefront of not only performance, but price performance, and most important, with very high ease of use,” Kleinerman said.
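To make the sizing change concrete, here is a minimal sketch, using the snowflake-connector-python package, of how a warehouse is provisioned with today’s T-shirt sizes. The connection details and warehouse name are placeholders, and the commented-out adaptive variant at the end is purely hypothetical, since Snowflake has not published the syntax for adaptive warehouses.

```python
# Minimal sketch: provisioning a Snowflake warehouse with today's
# T-shirt sizing via snowflake-connector-python. Credentials and the
# warehouse name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
)
cur = conn.cursor()

# Today: the customer picks a fixed size and tunes suspend/resume behavior.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
      WAREHOUSE_SIZE = 'XLARGE'
      AUTO_SUSPEND   = 300
      AUTO_RESUME    = TRUE
""")

# With adaptive compute, the size choice would go away and Snowflake would
# route queries to appropriately sized resources behind the scenes.
# Hypothetical syntax, for illustration only -- not announced DDL:
# cur.execute("CREATE WAREHOUSE reporting_wh ADAPTIVE = TRUE")

cur.close()
conn.close()
```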

Data Engineering

On the data engineering front, Snowflake is releasing a new data ingestion offering dubbed Openflow. Based on open source Apache NiFi and Snowpipe Streaming, Openflow will allow users to connect to a wide range of structured or unstructured data sources, such as relational and NoSQL databases, streaming data services like Kafka, or even PDF documents, and bring the data into Snowflake.

Openflow features a low-code visual interface where customers can build data connections using any of the 30-plus pre-built connectors, or by building their own. The software will feature a directed acyclic graph (DAG) visualization to help users understand how data is flowing in their environment. “There’s a big emphasis on Openflow in the topic of observability, so customers can easily see what’s going on, what data is flowing, where there may be data issues or data quality gaps,” Kleinerman said.

Openflow can run as a fully managed service that honors the governance and security guarantees built into Snowflake, Kleinerman said. Snowflake will also support a bring-your-own-compute version of Openflow that can run in customers’ virtual private cloud (VPC) environments, where customers manage the data movement themselves.

“Of course, most of our customers are interested in landing data into Snowflake or making data available to Snowflake, but as a generic framework, there are no restrictions,” he said. “Our goal is to be able to simplify data movement and data processing from any one source to any other destination.”

Snowflake Openflow is a data integration tool based on Apache NiFi (Image courtesy Snowflake)

Better integration with dbt is also on the docket. Soon there will be a public preview of a new offering dubbed dbt Projects that will allow customers to use dbt within Snowflake Workspaces, thereby taking advantage of dbt goodies, such as code assistance, native Git integration, and side-by-side differencing. That will be followed with support for dbt Fusion, the new Rust-powered version of dbt that was announced last week.

Analytics

On the analytics front, Snowflake is using Summit 25 to launch Cortex AISQL, a set of extensions to Snowflake’s SQL dialect designed “to make AI operations as simple as a function call,” Kleinerman said.

With AISQL, less technical users will be able to conjure up AI-powered functions, such as extracting information from a PDF document or an audio file, with just a simple API or function call, he said.

“What this does is it brings AI to the broad user base that Snowflake has had over the years with a simple and intuitive programming model,” Kleinerman said. “Again, [it’s] leaning very strongly on our goal that platforms to process data need to be easy and simple.”

Snowflake says Cortex AISQL will allow customers to create complex, multi-step AI workflows using the power of SQL. For instance, a question like “What is the yearly revenue growth and market outlook for companies that have recently undergone a CEO change and operate in the renewable energy sector?” will require a series of steps, including searching documents, filtering the criteria, extracting key information, joining unstructured insights, and synthesizing the findings.

“By allowing analysts to chain together AI-powered operations, from document filtering to semantic extraction to intelligent joining, all within familiar SQL syntax, it eliminates the need for multiple specialized tools and custom code,” Snowflake says in a blog. “This unified approach transforms what would traditionally require data science expertise and weeks of development into straightforward SQL queries that business analysts can build and modify in minutes.”
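To give a flavor of what such a chained query might look like, here is a hedged sketch that runs a single AISQL-style statement through the Python connector. The AI_FILTER and AI_COMPLETE function names and signatures are assumptions drawn from Snowflake’s description, not confirmed Cortex AISQL syntax, and the tables and columns are invented for illustration.

```python
# Hedged sketch of a multi-step AI workflow expressed as one SQL statement.
# AI_FILTER and AI_COMPLETE are assumed, illustrative function names; the
# financials and filings tables are invented for this example.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password"
)
cur = conn.cursor()

cur.execute("""
    SELECT
        f.company_name,
        f.yearly_revenue_growth,
        -- synthesize a market outlook from unstructured filing text
        AI_COMPLETE('Summarize the market outlook described in this filing: '
                    || d.filing_text) AS market_outlook
    FROM financials f
    JOIN filings d
      ON f.company_id = d.company_id
    WHERE f.sector = 'Renewable Energy'
      -- natural-language predicate over document text
      AND AI_FILTER('Does this filing describe a recent CEO change? '
                    || d.filing_text)
""")

for company, growth, outlook in cur.fetchall():
    print(company, growth, outlook)

cur.close()
conn.close()
```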

Snowflake Cortex AISQL lets users call AI routines embedded in SQL (Image courtesy Snowflake)

AI and ML

On the AI and machine learning front, Snowflake made several announcements, including Snowflake Intelligence, a new application that provides a conversational interface for business users and data professionals to ask natural language questions and get AI-powered answers.

Snowflake Intelligence, which will be in public preview soon, provides a shrink-wrapped way to apply foundation models from OpenAI and Anthropic to customers’ data. Instead of forcing customers to build their own conversational interface, Snowflake Intelligence handles all the connections as well as security and data governance.

Snowflake is launching prebuilt agents too. The Data Science Agent will help automate tedious ML tasks and help customers troubleshoot their ML workflows. The Data Science Agent is based on Anthropic’s Claude model and works with ML pipelines developed in Snowflake Notebooks.

The company is also unveiling SnowConvert AI, a new solution for accelerating customers’ migrations from competing data warehouses to Snowflake. The offering uses Cortex AI capabilities to convert code, BI reports, ETL jobs, and other data warehouse components to Snowflake code, thereby alleviating the burden on data engineers.

Apps and Collab

On the applications and collaboration front, Snowflake is also making procured data feeds from data suppliers such as CB Insights, Packt, Stack Overflow, The Associated Press, and USA TODAY available via Cortex Knowledge Extensions. These will be generally available soon.

Finally, Snowflake is also making some news in the semantic arena. Semantic models are commonly developed as part of the data modeling effort within a business intelligence or analytics tool, and they help ensure that metrics are consistent and everyone is on the same page. By allowing users to share semantic models, Snowflake hopes to drive some semantic standardization for the data it shares on the Snowflake Marketplace.

Snowflake Summit runs through Thursday.

Related Items:

Crunchy Data Goes All-In With Postgres

It’s Snowflake Vs. Databricks in Dueling Big Data Conferences

Snowflake Unleashes AI Agents to Unlock Enterprise Data
