

The concept of “observability” is well understood as it pertains to DevOps and site reliability engineering (SRE). But what does it mean in the context of data? According to Barr Moses, the CEO of data observability startup Monte Carlo, it’s all about being able to trust the data.
In the last decade or so, we’ve gotten really good at collecting, aggregating, storing, and analyzing data, Moses says. We’re able to move vast amounts of data from sources into data warehouses and data lakes, and utilize the data for dashboards and machine learning models.
“But whether we can actually trust the data is something we haven’t figured out yet,” Moses tells Datanami. “In companies, small to large, really the worst thing that can happen is when you start using the data, but it actually can’t be trusted.”
In fact, making decisions based on bad data can actively hurt a company. Considering all of the ways that humans and algorithms are consuming data in business today, using bad data is worse than doing nothing at all.
There are many ways that data can go bad. From data-entry or coding errors to malfunctioning sensors and data drift, the sources of data contamination are numerous. We can’t guarantee that data will never go bad, so the next best thing is to detect the bad data as soon as possible.
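Catching bad data quickly is, at bottom, a matter of continuously comparing fresh batches against expectations. As a rough illustration (and not Monte Carlo’s actual logic), a batch-level sanity check might look like the minimal Python sketch below, where the thresholds, baseline statistics, and toy sensor readings are all assumed for the example:

```python
from statistics import mean

def check_batch(values, baseline_mean, baseline_stdev, max_null_rate=0.01):
    """Flag a batch whose null rate or mean drifts past simple thresholds."""
    null_rate = sum(v is None for v in values) / len(values)
    if null_rate > max_null_rate:
        return f"null rate {null_rate:.1%} exceeds {max_null_rate:.1%}"
    clean = [v for v in values if v is not None]
    # Flag a mean more than three baseline standard deviations away (drift).
    if abs(mean(clean) - baseline_mean) > 3 * baseline_stdev:
        return f"mean {mean(clean):.2f} drifted from baseline {baseline_mean:.2f}"
    return None  # batch looks healthy

readings = [21.4, 21.7, None, None, 21.6, 35.0, 21.5]  # toy sensor batch
issue = check_batch(readings, baseline_mean=21.5, baseline_stdev=0.2)
if issue:
    print("bad data detected:", issue)
```

Real systems learn these baselines from history rather than hard-coding them, but the principle is the same: the sooner a check like this fires, the less bad data reaches consumers.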
That’s the idea behind data observability. According to Moses, data observability borrows well-established observability concepts from DevOps and SRE in one key respect: the importance of monitoring the outputs of data systems to determine if something is going awry inside the system.
“If you think about what DevOps or software engineering teams are tasked with, they have many applications and pieces of system infrastructure that they’re responsible for keeping up and running at all times,” Moses says. “It’s a very well-understood approach in DevOps, but it’s completely new to data. That’s what we call data observability.”
Monte Carlo’s data observability offering is structured around five pillars of observability (see the sketch after the list):
- Freshness, or the timeliness of the data;
- Volume, or the completeness of the data;
- Distribution, which measures the consistency of data at the field level;
- Schema, relating to the structure of fields and tables;
- Lineage, or a change-log of the data.
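To make the pillars concrete, here is a minimal sketch of how the first two, freshness and volume, might be checked against a warehouse table. The `orders` table, `loaded_at` column, SQLite stand-in, and thresholds are illustrative assumptions, not Monte Carlo’s implementation:

```python
import sqlite3  # stands in here for any DB-API warehouse driver
from datetime import datetime, timedelta, timezone

def freshness_check(conn, table, ts_col, max_lag=timedelta(hours=1)):
    """Freshness: has new data landed in the table recently enough?"""
    (latest,) = conn.execute(f"SELECT MAX({ts_col}) FROM {table}").fetchone()
    lag = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
    return lag <= max_lag

def volume_check(conn, table, expected_rows, tolerance=0.2):
    """Volume: is the row count within tolerance of what we expect?"""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return abs(count - expected_rows) <= tolerance * expected_rows

# Toy setup for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, loaded_at TEXT)")
conn.execute("INSERT INTO orders VALUES (1, ?)",
             (datetime.now(timezone.utc).isoformat(),))
print(freshness_check(conn, "orders", "loaded_at"))   # True
print(volume_check(conn, "orders", expected_rows=1))  # True
```

Distribution, schema, and lineage checks follow the same query-and-compare pattern, against field-level statistics, the information schema, and pipeline metadata respectively.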
“The only way to truly solve data observability is to do it in an end-to-end way,” Moses says. “So it includes the customers’ entire data stack. That includes cloud data lakes, data warehouses, ETL, BI, and machine learning models.”
The company, which has raised $41 million in venture capital to date, has built connectors that pull data from each component of the stack and monitor it in a read-only manner. If the output begins to show signs of a problem – such as a dashboard reporting null values – the software automatically generates an alert and sends it via email, text, Slack, or PagerDuty.
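By way of illustration, the alerting half of that loop can be as simple as a webhook call once a monitor trips. The sketch below uses Slack’s standard incoming-webhook JSON payload; the webhook URL, threshold, and null-rate figure are placeholders, not Monte Carlo’s code:

```python
import json
import urllib.request

SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

def send_alert(message: str) -> None:
    """Post a data-incident alert to a Slack channel via incoming webhook."""
    payload = json.dumps({"text": f":rotating_light: {message}"}).encode()
    req = urllib.request.Request(SLACK_WEBHOOK, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

# Toy monitoring result: a null-rate spike pages the on-call data engineer,
# much as an SRE tool pages for a down service.
null_rate = 0.37  # e.g., share of null values in a monitored column
if null_rate > 0.05:
    send_alert(f"orders.amount null rate at {null_rate:.0%}; possible data downtime")
```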
It’s about avoiding data downtime. Just as DevOps and SRE teams have instrumented their systems to detect the faintest hint of an impending failure that could take production systems offline, data observability, as Monte Carlo practices it, takes a holistic approach to monitoring a range of data characteristics to figure out when good data is breaking bad.
Moses draws a parallel between what Monte Carlo is setting out to do and what New Relic, Datadog, and AppDynamics do in the DevOps and SRE space. While data infrastructure providers (i.e., vendors building databases, data lakes, data warehouses, and ETL, BI, and ML tools) supply some monitoring capabilities for their products, it’s an industry best practice to tap third-party observability and monitoring tools to bring all of those components together into a comprehensive view.
“I think it’s important to have a new layer in the modern data stack,” says Moses, who was a management consultant for Bain & Company before founding Monte Carlo with Lior Gavish in 2019. “And part of the new layer is data observability. It’s important for it to be a third party that can be integrated with all of these vendors and solutions…and also provide a third-party, objective view” that can be trusted.
Today, the San Francisco company announced a partnership with cloud data warehouse provider Snowflake. The deal will see Monte Carlo become a “native” provider of data observability for Snowflake customers.
“Data is only powerful if you can actually trust it,” Moses says. “We firmly believe that’s a key part of building a strong data platform and data architecture. The Snowflake team has been really supportive of our vision and this new category of data observability. We’re excited to partner with them to bring this to our customers.”
Related Items:
Do You Have Customer Data You Can Trust?