July 24, 2025

Why Supply Chain Resilience Starts with a Common Data Language

Melanie Nuce-Hilton

U.S.-China tariffs may be on pause, but the pressure on supply chains isn’t going anywhere. For supply chain and procurement professionals, the result is continued uncertainty. Teams are having to reexamine sourcing strategies in real time, yet many companies still can’t answer basic questions like where their components are sourced or which SKUs are most exposed.

It’s not just geopolitical shifts creating chaos, it’s the data powering supply chains that’s falling short. New research shows that nearly half of supply chain leaders still lack full visibility into their supply chain networks. Without a clean and connected foundation based on quality data, companies can’t quickly pivot sourcing strategies, respond to shifting trade policies, or assess exposure to risk in real time. Regulatory compliance becomes harder to maintain, and advanced technologies like AI are rendered less effective, amplifying inadequate inputs instead of unlocking real insight.

What’s obscuring supply chains today isn’t always a lack of effort; it’s a lack of clarity, driven by visibility challenges rooted in bad data.

Fragmented Data, Missed Opportunities

Despite increasing digitalization efforts, many supply chains are still stitched together with spreadsheets, emails, and siloed systems. When each node operates independently, whether it’s the factory, the warehouse, or the distribution center, blind spots emerge, and those blind spots become bottlenecks that make it harder to act when disruptions occur.

Consider a food and beverage manufacturer using three separate systems: one for tracking ingredients, another for shipments, and another for supplier information. If these platforms don’t talk to each other, a delay recorded in one may not be reflected in the others—leading to missed deadlines, customs delays, or inaccuracies in sourcing information. These errors don’t just slow things down, they create real financial and reputational risk.
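To make the failure mode concrete, here is a deliberately simplified sketch in Python (every system name, key, and value below is invented): because each platform keys the same ingredient its own way, the records cannot be matched automatically, so a delay logged in one system never reaches the teams working from the others.

```python
# Illustrative only: three hypothetical, siloed systems describing the same item.
# Each uses its own ad-hoc key, so a delay logged in one system cannot be
# matched automatically against the records held in the others.

ingredients = {"ITEM-00417": {"description": "Tomato paste, 5 kg", "status": "in stock"}}
shipments   = {"TOMPASTE-5KG": {"eta": "2025-08-01", "status": "delayed"}}
suppliers   = {"SKU_4417A": {"supplier": "Acme Foods", "country_of_origin": "unknown"}}

# A naive join on the raw keys finds no overlap at all.
shared_keys = ingredients.keys() & shipments.keys() & suppliers.keys()
print(shared_keys)  # set() -- the delay recorded in `shipments` never reaches planning
```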

IPC relies on data standards to keep 23,000 Subway stores well-stocked (Mahmoud Suhail/Shutterstock)

In contrast, IPC, the independent purchasing cooperative that manages the supply chain for approximately 23,000 Subway restaurants, demonstrates the power of standardized data in action. As the central organization responsible for sourcing, pricing, food safety, inventory, and logistics, IPC must coordinate with thousands of suppliers and distributors to keep Subway’s operations running smoothly. To reduce inefficiencies and stay ahead of evolving regulations, IPC implemented global data standards that uniquely identify every product and location across its network.

This end-to-end visibility strengthened traceability, improved inventory management, and streamlined food safety practices, resulting in measurable benefits, including $1.3 million in annual cost avoidance through more efficient planning. For a supply chain of that scale and complexity, standardization wasn’t just helpful, it was transformative.

Disruptions, whether driven by extreme weather, labor shortages, or policy changes, will continue to test supply chains. And yet, only 45% of companies today can provide meaningful transparency into where materials come from, while another 44% lack centralized data management altogether. Without a common foundation, organizations aren’t just reacting slowly: they’re flying blind.

Standardization Solves the Data Problem

Building resilience isn’t just about buying more tech, it’s about making data more trustworthy, shareable, and actionable. That’s where global data standards play a critical role.

The most agile supply chains are built on a shared framework for identifying, capturing, and sharing data. When organizations use consistent product and location identifiers, such as GTINs (Global Trade Item Numbers) and GLNs (Global Location Numbers) respectively, they reduce ambiguity, improve traceability, and eliminate the need for manual data reconciliation. With a common data language in place, businesses can cut through the noise of siloed systems and make faster, more confident decisions.
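One practical property of these identifiers is that they are machine-verifiable: GTINs and GLNs both end in a GS1 mod-10 check digit, so malformed keys can be rejected the moment data enters a system. A minimal sketch in Python (the function names are illustrative, not part of any GS1 library):

```python
def gs1_check_digit(body: str) -> int:
    """Compute the GS1 mod-10 check digit for the digits that precede it.

    The same algorithm covers GTIN-8/12/13/14 and GLN-13: weight the digits
    3, 1, 3, 1, ... starting from the rightmost digit of `body`, sum them,
    and return the value that rounds the sum up to the next multiple of 10.
    """
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10


def is_valid_gs1_key(key: str) -> bool:
    """True if the last digit of a GTIN or GLN matches its computed check digit."""
    return key.isdigit() and int(key[-1]) == gs1_check_digit(key[:-1])


print(is_valid_gs1_key("4006381333931"))  # True  -- a commonly cited GTIN-13 example
print(is_valid_gs1_key("4006381333932"))  # False -- corrupted check digit
```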

Companies further along in their digital transformation can also explore advanced data-sharing standards like EPCIS (Electronic Product Code Information Services) or RFID (radio frequency identification) tagging, particularly in high-volume or high-risk environments. These technologies offer even greater visibility at the item level, enhancing traceability and automation.
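For a rough sense of what this looks like in practice, the sketch below approximates the shape of an EPCIS-style “object event” recording that serialized items left a GLN-identified location. The identifiers and vocabulary values are placeholders, and this is not a complete or conformant EPCIS 2.0 document:

```python
import json

# Illustrative sketch only: roughly the shape of an EPCIS-style "object event"
# saying that two serialized items (derived from a GTIN) were observed shipping
# from a location identified by a GLN. All identifiers are placeholders.
shipping_event = {
    "type": "ObjectEvent",
    "eventTime": "2025-07-24T14:30:00Z",
    "eventTimeZoneOffset": "+00:00",
    "epcList": [
        "urn:epc:id:sgtin:0614141.107346.2017",   # placeholder serialized GTINs
        "urn:epc:id:sgtin:0614141.107346.2018",
    ],
    "action": "OBSERVE",
    "bizStep": "shipping",
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00777.0"},  # placeholder GLN-based ID
}

print(json.dumps(shipping_event, indent=2))
```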

And the benefits of this kind of visibility extend far beyond trade compliance. Companies that adopt global data standards are significantly more agile: studies show that 58% of companies with full standards adoption say they manage supply chain agility “very well,” compared with just 14% of those with no plans to adopt standards.

Standardized data also powers more effective use of downstream tools like AI. From predictive forecasting to anomaly detection, automated auditing, and content aggregation and generation based on complex knowledge graphs, these systems are only as effective as the data they’re built on. Without a standardized, structured foundation, AI can amplify bad inputs and produce unreliable insights. But with high-quality, standards-based data, organizations can unlock the full value of their technology investments — making smarter, faster decisions with greater confidence.
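As a simple illustration of that point (not a description of any particular product), once lead times are keyed consistently by GTIN, even a basic statistical check can surface the kind of outlier a more sophisticated AI tool would flag. All numbers below are invented:

```python
from statistics import mean, stdev

# Hypothetical lead-time history, in days, keyed by a placeholder GTIN-14.
lead_times_days = {
    "00614141107346": [12, 11, 13, 12, 14, 12, 27],
}

for gtin, history in lead_times_days.items():
    baseline, latest = history[:-1], history[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    # Flag the newest observation if it sits far outside the historical norm.
    if sigma and abs(latest - mu) > 3 * sigma:
        print(f"GTIN {gtin}: latest lead time of {latest} days is well outside the ~{mu:.0f}-day norm")
```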

How to Build a Standards-Based Data Strategy

Moving from fragmentation to clarity doesn’t require a full overhaul. It starts with a few practical steps:

Map your blind spots: Visibility begins with understanding what data you have, and what’s missing. Identify where sourcing, product, and location data lives, who owns it, and how it’s managed. A cross-functional audit can uncover inconsistencies and friction points.

Standardize at the source: Use GTINs to uniquely identify products and GLNs to pinpoint locations throughout the supply chain. These standards eliminate ambiguity and give downstream systems a reliable foundation to build on.

Centralize and connect your data: Integrate key systems like procurement, logistics, and compliance on a common platform; a minimal sketch after this list illustrates the idea. Companies using centralized, standards-based platforms report far greater confidence in the accuracy of their data.

Collaborate with your trading partners: The strongest supply chains don’t just digitize data—they share it. Aligning with trading partners to share quality data helps companies act with increased agility, visibility and confidence when disruptions occur. Start by agreeing on shared standards like product identifiers and data formats, then work together to integrate those into each partner’s systems and workflows.
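As a minimal sketch of the audit-then-connect idea referenced above (all field names and records are hypothetical), once procurement and logistics data share GTIN and GLN keys, gaps are easy to spot and joining the systems is straightforward:

```python
# Hypothetical records from two systems that already carry shared GS1 keys.
procurement = [
    {"gtin": "00614141107346", "gln": "0614141000777", "supplier": "Acme Foods"},
    {"gtin": "00614141107353", "gln": None,            "supplier": "Beta Ingredients"},
]
logistics = [
    {"gtin": "00614141107346", "gln": "0614141000777", "status": "delayed"},
]

# Step 1: audit -- flag records missing the shared identifiers.
missing = [r for r in procurement + logistics if not r["gtin"] or not r["gln"]]
print(f"{len(missing)} record(s) missing a GTIN or GLN")

# Step 2: connect -- index one system by (GTIN, GLN) and join the other onto it.
by_key = {(r["gtin"], r["gln"]): r for r in logistics}
for row in procurement:
    match = by_key.get((row["gtin"], row["gln"]))
    if match:
        print(f"{row['supplier']}: shipment status is '{match['status']}'")
```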

From Data Chaos to Supply Chain Clarity

You can’t reroute what you can’t see. In a global economy defined by constant disruption, visibility is no longer optional, it’s a competitive advantage.

Tariffs, regulations, and extreme events may be outside your control, but your data isn’t. By adopting global standards, companies can move from reactive problem-solving to proactive planning. They gain the tools to make faster decisions, reduce operational risk, and unlock the full potential of supply chain innovation.

In the end, resilience doesn’t start with more tech. It starts with better data.

About the author: Melanie Nuce-Hilton is the Senior Vice President of Customer Success at GS1 US, the non-profit international organization dedicated to furthering data standards for UPC barcodes, RFID tags, GLNs, GTINs, and EPCIS.  

Related Items:

Why the Future of Retail is AI Everywhere

Artificial Intelligence in AP Automation – A Look at What Really Works, and What Doesn’t

The Trade War, Supply Chain Risk, and AI