
Vendors Rush to Adopt MCP, the USB-C Cord for AI Integration

When Anthropic unveiled the Model Context Protocol (MCP) last fall, it didn’t make a whole lot of headlines. Nearly six months later, the protocol is the talk of the town, as one data infrastructure vendor after another announces support for the standard, which was designed to simplify the integration of AI models, data sources, and AI tools.
The core idea behind MCP is fairly simple, according to Anthropic. Instead of building and maintaining individual connectors for each data source, AI application developers can use a single, open protocol to handle all of those connections.
“Think of MCP like a USB-C port for AI applications,” Anthropic wrote in an introduction to MCP. “Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.”
MCP uses a lightweight client-server architecture. The MCP server component resides alongside databases, file servers, or enterprise applications, such as a CRM system, while the MCP client resides in the AI application, such as Anthropic’s Claude Desktop or another AI-powered app.
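To make that split concrete, here is a minimal sketch of the server half, written with the official MCP Python SDK (the `mcp` package); the tool names and the file-based data it exposes are purely illustrative and not taken from any vendor mentioned in this article.

```python
# server_sketch.py -- a minimal MCP server, for illustration only.
# Assumes the official MCP Python SDK (pip install mcp); tool names,
# paths, and data are hypothetical.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

# The name is what MCP clients (such as Claude Desktop) see when they connect.
mcp = FastMCP("demo-report-server")

@mcp.tool()
def list_reports(directory: str = "./reports") -> list[str]:
    """List the report files the model is allowed to ask about."""
    return sorted(p.name for p in Path(directory).glob("*.txt"))

@mcp.tool()
def read_report(name: str, directory: str = "./reports") -> str:
    """Return the contents of a single report file."""
    return (Path(directory) / name).read_text()

if __name__ == "__main__":
    # Runs over stdio by default: a desktop AI app can launch this script as a
    # subprocess and exchange JSON-RPC messages with it to discover and call tools.
    mcp.run()
```

On the other side of the connection, an AI application configured to launch this server sees `list_reports` and `read_report` as tools it can call mid-conversation, with the host application brokering each request.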
After a somewhat slow start over the winter, MCP support is picking up in a big way, particularly in the past month. Here are some of the vendors now supporting MCP:
CData
One of the companies launching support for MCP is CData, the unheralded software vendor that builds and maintains many of the database connectors that database vendors license to do the hard work of exposing their databases to the outside world through standard interfaces and protocols, such as ODBC and SQL.
With its new MCP Servers offering, which is currently in beta, CData supports MCP as one of the transformation protocols on the downstream side of its data adapters. It doesn’t yet support MCP in each of the 350-plus adapters that it builds and maintains, but the company says that is the plan.
“We’ve wrapped our connectors in an MCP server that exposes SQL-based tools to LLMs,” CData CEO and Co-founder Amit Sharma explains in a recent blog post. “This isn’t a one-off integration. It’s a general-purpose interface that allows LLMs to introspect business data, write and execute SQL, and even perform write operations–all on behalf of the user.”
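The pattern Sharma describes maps onto a fairly small amount of code. The sketch below shows its general shape, again using the MCP Python SDK, with a generic ODBC connection via pyodbc standing in for a CData connector; the DSN, tool names, and lack of access controls are all simplifications, and none of this is CData’s actual implementation.

```python
# sql_mcp_sketch.py -- the "SQL tools over MCP" pattern, sketched for
# illustration. Uses the MCP Python SDK and pyodbc; the DSN and tool
# surface are hypothetical, not CData's actual code.
import pyodbc
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sql-connector")

# Any ODBC-reachable source works here; "BusinessData" is an assumed DSN.
conn = pyodbc.connect("DSN=BusinessData", autocommit=True)

@mcp.tool()
def list_tables() -> list[str]:
    """Let the model introspect which tables exist before it writes SQL."""
    cur = conn.cursor()
    return [row.table_name for row in cur.tables(tableType="TABLE")]

@mcp.tool()
def run_sql(query: str) -> list[dict]:
    """Execute SQL written by the model and return rows as dictionaries."""
    cur = conn.cursor()
    cur.execute(query)
    columns = [col[0] for col in cur.description]
    return [dict(zip(columns, row)) for row in cur.fetchall()]

if __name__ == "__main__":
    mcp.run()
```

A production version would add the controls Sharma alludes to, such as per-user credentials, allow-lists for write operations, and result-size limits, since the LLM is composing the SQL itself.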
Dremio
Dremio this week announced support for MCP with its data lakehouse platform. The company says the launch of Dremio MCP Server will enable “native integration” with AI models, such as Anthropic’s Claude.
Dremio says the combination of its own semantic layer with MCP support will enable AI agents to explore datasets, generate queries, and retrieve governed data in real time, while maintaining context across workflows.
“Dremio’s implementation of MCP enables Claude to extend its reasoning capabilities directly to an organization’s data assets, unlocking new possibilities for AI-powered insights while maintaining enterprise governance,” said Mahesh Murag, an Anthropic product manager.
StarTree
Real-time analytics database provider StarTree is also getting in on the MCP action. Last week, the company announced that its forthcoming support for MCP will enable the StarTree database to serve data to “swarms of AI agents” demanding access to the latest, freshest data. The new release also offers support for hosting vector embeddings, which will bolster its prospects in serving data as part of retrieval augmented generation (RAG) workflows.
StarTree is an enterprise version of Apache Pinot, a real-time analytics database that is known for its high-speed data ingestion and indexing functions as well as the capability to serve large numbers of SQL queries simultaneously. Kishore Gopalakrishna, StarTree’s co-founder and CEO, says the new MCP and vector embedding capabilities will help StarTree customers push real-time data out to AI models and agents.
“The next wave of AI innovation will be driven by real-time context—understanding what’s happening now,” Gopalakrishna said. “StarTree’s heritage as a real-time analytics foundation perfectly complements where AI is going by delivering fresh insights at scale. What is changing is the shift from apps as the consumer to autonomous agents.”
MindsDB
Database developer MindsDB is adopting MCP with both its open source and enterprise platforms. The idea is to position MindsDB “as a unified AI data hub that standardizes and optimizes how AI models access data, dramatically simplifying artificial intelligence deployment in complex data environments,” the company said in an April announcement.
MindsDB CEO Jorge Torres says his company’s MCP support will help expose the right piece of data to the right model at the right time with the right security controls, thereby helping to “eliminate the data sprawl that is posing a threat to enterprise AI innovation.”
Docker
Software container provider Docker last month announced MCP support in two new products: Docker MCP Catalog and Docker MCP Toolkit. Docker MCP Catalog will provide developers with a centralized way to discover verified and curated MCP tools within the Docker Hub, the company says, while Docker MCP Toolkit will let developers run, authenticate, and manage MCP tools using their standard Docker interface.
The new Docker offerings will allow developers to build AI and agentic applications and deploy them as standard Docker containers. Docker says it will let “developers put AI to work without reinventing their workflow.”
Docker is also building a developer ecosystem for MCP tools along with Elastic, Salesforce, and New Relic, among others. “Docker is positioning itself at the intersection of containerization and AI, where speed, consistency, and security are essential,” said Paul Nashawaty, Practice Lead and Principal Analyst at theCUBE Research.
Cloudflare
Cloud provider Cloudflare last month announced that it’s also supporting MCP. The company says it’s launching a remote MCP server that will enable developers to build agentic AI applications that can connect over the Internet and interact with services, like email, “without the need for a locally hosted server.”
It’s supporting MCP servers running as Durable Objects, a “special type of Cloudflare Worker that combine compute with storage, allowing you to build stateful applications in a serverless environment without managing infrastructure.” In this manner, MCP servers running on Cloudflare can retain context, which the company says will provide a more persistent, better experience for users.
By providing a free tier for Durable Objects and serverless options for AI inference, Cloudflare offers a developer-friendly ecosystem for creating AI agents, says Kate Holterhoff, a senior analyst at RedMonk. “These low-cost, easy-to-use options could empower more businesses to adopt and experiment with agentic AI,” she added.
Google Cloud
Google Cloud announced support for MCP during its Next 2025 conference last month. Specifically, it’s supporting MCP in its Agent Garden, a component of its AI offering where customers can browse ready-to-use samples of AI applications and then start building with them in its SDK.
Google Cloud also used the show to announce its own Agent to Agent (A2A) protocol, which is geared toward enabling agents to call and connect to other agents, as opposed to the AI models and tools that are the focus of MCP, according to Amin Vahdat, the company’s vice president of ML, systems, and cloud AI.
MinIO
MinIO, the developer of the S3-compatible object store of the same name, is also supporting MCP. Last month, the company announced a preview of MCP support in AIStor, the enterprise version of its open source object storage system.
The MCP capability will enable agents to “talk” to MinIO data through an MCP server, MinIO co-founder and co-CEO AB Periasamy said. MinIO also announced that it’s building administration capabilities into its MCP offering to prevent unauthorized access from MCP clients, as well as introducing monitoring capabilities for the MCP server.
Databricks
Databricks hasn’t officially announced support for MCP, and searches for MCP on its website turn up empty. But it appears the company is supporting the protocol (or close to supporting it) just the same.
According to the description of an upcoming session on Mosaic AI Agent Framework at the company’s Data + AI Summit (June 9-12 in San Francisco), MCP is a supported protocol in AI agents that are callable from its AI Playground, while support for MCP in Unity Catalog provides governance and security for enterprise deployments.
There’s also an open source Databricks MCP Server available on GitHub that supports the Databricks API and allows LLMs and AI agents to run SQL queries and otherwise access Databricks environments.
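For a sense of what calling such a server looks like from the agent side, here is a hedged sketch of the client half, using the same MCP Python SDK; the launch command and the `run_sql` tool name are placeholders carried over from the earlier sketch, not the actual interface of the Databricks project on GitHub.

```python
# client_sketch.py -- the MCP client side, for illustration only.
# Uses the official MCP Python SDK; the server command and tool name are
# placeholders, not the interface of any specific vendor's MCP server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch a local MCP server as a subprocess and talk to it over stdio.
server = StdioServerParameters(command="python", args=["sql_mcp_sketch.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the server's tools; an agent framework would hand these
            # definitions to the LLM so it can decide what to call.
            tools = await session.list_tools()
            print("tools:", [t.name for t in tools.tools])

            # Invoke a tool on the model's behalf (the name is hypothetical).
            result = await session.call_tool("run_sql", {"query": "SELECT 1"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```

In a hosted setup like the one described for the Data + AI Summit session, that same handshake would happen inside the agent framework rather than from a standalone script, with Unity Catalog governing what the agent can reach.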
There are a number of other MCP implementations out there among data platforms, clouds, database vendors, and others. As interest in agentic AI builds, developers will naturally gravitate toward tools that let them accomplish tasks with the least amount of cost and complexity. It’s not totally clear that MCP will serve as the data integration standard for agentic AI and GenAI in the long run. But in the short term, it’s definitely off to a good start.
Related Items:
Will Model Context Protocol (MCP) Become the Standard for Agentic AI?
The Future of AI Agents is Event-Driven
LLMs and GenAI: When To Use Them