
MCP Roundup: Why Database Providers Are Rebuilding Around Context

The shift from experimental AI prototypes to real-world, integrated systems is exposing a critical gap in enterprise data infrastructure: context. As businesses aim to deploy AI agents that act autonomously and intelligently, they’re finding that raw data and fast compute alone aren’t enough.
What’s missing is a way for those systems to stay aware of the user, the task, the tools involved, and the data behind them. This is where MCP, or Model Context Protocol, is starting to come into focus.
This shift is becoming especially visible in the database world. Databases are no longer just the place where data lives. They are becoming part of how AI understands the environment it is working in. With support for MCP, databases can help AI agents access and act on information with more continuity and less guesswork.
Originally introduced by Anthropic in late 2024, MCP provides a standard way for AI models to connect with external tools and data sources while holding on to relevant context.
At its core, MCP is about giving models memory. Not just remembering the last message, but carrying context such as which app a user is in, what kind of data they are working with, and what they are trying to get done. Instead of starting from scratch with every prompt, the model can begin already informed.
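Under the hood, that exchange of context is not magic: MCP is built on JSON-RPC 2.0 messages, with standard methods such as `tools/list` and `tools/call`. The sketch below builds those messages with nothing but the standard library; the `query_database` tool name and its arguments are hypothetical stand-ins for whatever a real server registers.

```python
import json

def make_request(request_id: int, method: str, params: dict) -> str:
    """Build a JSON-RPC 2.0 request, the wire format MCP is built on."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# An agent asking an MCP server which tools it exposes:
list_tools = make_request(1, "tools/list", {})

# ...and then invoking one of them. The tool name and arguments here
# are hypothetical, standing in for whatever the server advertises.
call_tool = make_request(2, "tools/call", {
    "name": "query_database",
    "arguments": {"sql": "SELECT count(*) FROM orders"},
})

print(call_tool)
```

Because every client and server speaks this same envelope, an agent does not need bespoke glue code per database; it only needs to learn which tools a given server advertises.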
Oracle is one of the latest players to move in this direction. In its July 2025 update, the company introduced support for MCP inside its database platform. This allows AI agents to pull structured context directly from the database, including things like schema, security settings, recent queries, and how different teams are using the system.
“The Model Context Protocol is emerging as a critical standard for structured communication between LLM-powered agents and external tools,” wrote Supreeth Bare, Senior Software Engineer at Oracle. “By providing a unified way to pass context, MCP unlocks new possibilities for building intelligent, context-aware applications. Its standardized approach is especially useful for agents that need to work across multiple tools and data sources.
“Within Oracle’s GenAI platform, MCP enables agents to retrieve, understand, and act on information with more autonomy and precision.”
Snowflake is one of the more active players experimenting with MCP. The company has released open source tools that let developers set up MCP servers connected to its Cortex services, giving agents a way to retrieve data, run analysis, or summarize documents from within the Snowflake environment. It’s a step toward standardizing how AI agents interact with structured and unstructured data without relying on brittle integrations.
Cortex Agents are critical to these efforts. These agents can handle more than a single query, coordinating across tools and refining results as they go. With MCP acting as the bridge, Snowflake is making it easier for teams to test multi-step AI workflows using their existing data. It’s still early, but it shows how database platforms are beginning to make room for agents that need more than just read access; they need to participate in real tasks.
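Conceptually, an MCP server like the ones Snowflake has open sourced is a registry of named tools plus a dispatch loop that routes incoming `tools/call` requests to them. The toy sketch below shows that shape without any framework; real servers use an MCP SDK and speak JSON-RPC over stdio or HTTP, and the tool names and stub bodies here are hypothetical, not Snowflake's actual interface.

```python
from typing import Callable

# Registry mapping advertised tool names to their handlers.
TOOLS: dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Decorator that registers a function as a callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("search_documents")
def search_documents(query: str) -> str:
    # A real server would call a search/summarization service here.
    return f"top documents for: {query}"

@tool("run_analysis")
def run_analysis(sql: str) -> str:
    # A real server would execute this in the warehouse.
    return f"result of: {sql}"

def handle_call(name: str, arguments: dict) -> str:
    """Route a tools/call request to the registered handler."""
    return TOOLS[name](**arguments)
```

A multi-step agent workflow then becomes a sequence of such calls: search for documents, feed what comes back into an analysis query, refine, and repeat.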
DataStax is taking a different route into the MCP ecosystem, one that highlights how NoSQL databases can fit into the future of agentic AI. Through its partnership with Claude and tooling like Cursor, the company is giving developers a way to interact with Astra DB using plain language. Instead of writing queries or stitching together APIs, users can now ask an agent to build a table, populate it, or clean it up, and the agent knows what to do.
MCP gives Astra DB a way to express its capabilities in terms that a model can understand and reason with. That means agents aren’t just retrieving data, they’re making choices by using the database as part of a broader task, not just a source. For a company rooted in real-time, distributed systems, this is a quiet but meaningful shift toward making those systems available to intelligent, autonomous workflows.
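What "expressing capabilities in terms a model can reason with" looks like in practice: MCP tools are advertised with a name, a human-readable description, and a JSON Schema for their inputs, which is what lets a model decide when and how to call them. The `create_table` tool below is a hypothetical illustration of that shape, not Astra DB's actual interface.

```python
import json

# A hypothetical tool definition in the shape MCP servers advertise:
# name, description, and a JSON Schema ("inputSchema") for arguments.
create_table_tool = {
    "name": "create_table",
    "description": "Create a new table in the database.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "table_name": {"type": "string"},
            "columns": {
                "type": "array",
                "items": {"type": "string"},
                "description": "Column definitions, e.g. 'id uuid'",
            },
        },
        "required": ["table_name", "columns"],
    },
}

print(json.dumps(create_table_tool, indent=2))
```

Given a definition like this, a request in plain language ("make me a table for customer events") can be grounded: the model fills in the schema's required fields rather than guessing at an API.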
While companies like Oracle, Snowflake, and DataStax are building MCP into their commercial offerings, the PostgreSQL community is taking a grassroots approach. There’s no official support yet, but developers have begun experimenting with ways to wire Postgres into agent workflows. From GitHub projects to forum discussions, the community is building lightweight adapters that let AI models interact with Postgres using the MCP standard.
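A recurring concern in these community adapters is safety: before handing an agent a live database, most expose it read-only. One common pattern is a guard that rejects anything other than a single SELECT before execution. The check below is an illustrative sketch of that idea, not any particular project's code, and a deliberately conservative one (it would also reject legitimate `WITH ... SELECT` queries).

```python
import re

def is_read_only(sql: str) -> bool:
    """Accept a single SELECT statement; reject writes and stacked queries."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # more than one statement smuggled in
        return False
    return re.match(r"(?is)^\s*select\b", stripped) is not None

assert is_read_only("SELECT * FROM users")
assert not is_read_only("DROP TABLE users")
assert not is_read_only("SELECT 1; DELETE FROM users")
```

An adapter would run this guard on every incoming query from the agent, execute only the queries that pass, and return an error message (rather than raising) for the rest, so the model can recover and rephrase.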
Similarly, vector databases like Pinecone aren’t officially part of the MCP ecosystem yet, but they’re circling the edges. These specialized databases are built to help AI models retrieve relevant information efficiently, often powering memory and search behind the scenes.
That mission aligns closely with MCP’s goal of helping agents understand and interact with external tools more intelligently. If MCP continues to gain ground, vector databases could move from being passive data fetchers to active contributors in agent workflows.
Related Items
Vector Databases Emerge to Fill Critical Role in AI
What Benchmarks Say About Agentic AI’s Coding Potential
The Future of AI Agents is Event-Driven