
DataBank Reports 60% of Enterprises Already See AI ROI or Expect to Within 12 Months
DALLAS, Aug. 14, 2025 — DataBank, a leading provider of enterprise-class colocation, connectivity, and managed services, today released a new report based on sponsored research showing that enterprises are rapidly adopting hybrid infrastructure approaches and geographic distribution strategies as AI moves from pilot projects to business-critical applications.
The report, “Accelerating AI: Navigating the Future of Enterprise Infrastructure,” documents how organizations are evolving from generic AI applications to customized models requiring fundamentally different infrastructure strategies.
“The data shows a clear evolution: While most enterprises start AI initiatives in the cloud, they’re quickly adopting hybrid approaches that combine cloud, on-premises, and colocation for different workloads,” said Raul Martynek, CEO of DataBank. “Success requires infrastructure flexibility that can support both centralized training and distributed inference while meeting security and compliance requirements.”
Key Research Findings
The study identified five critical trends shaping enterprise AI adoption:
- Enterprises are achieving meaningful AI returns. Twenty-five percent report consistent annual ROI, with another 35% expecting returns within the next year. Organizations are shifting from quick wins to transformational capabilities that enable entirely new business functions.
- Integration challenges are replacing data quality concerns. Only 20% of respondents cite poor data quality as a major obstacle; the primary barriers have shifted to integration complexity, scaling difficulties, and talent shortages.
- Hybrid infrastructure is becoming the standard. While 64% of enterprises start AI initiatives in public or private cloud, organizations are increasingly combining cloud services with on-premises and colocation infrastructure for sensitive workloads.
- Companies are expanding AI infrastructure geographically. Seventy-six percent plan to move AI infrastructure closer to data sources and end users to reduce latency and meet compliance requirements. AI training is centralizing while inference workloads are being distributed globally.
- AI strategies are growing more sophisticated. Enterprises are moving from generic third-party AI and large language models (LLMs) toward customized or proprietary models. Infrastructure approaches are blending off-the-shelf applications, custom solutions, and tailored deployment models.
Infrastructure Implications
The research highlights significant implications for AI infrastructure planning. Organizations need flexible, hybrid approaches that can support both centralized AI training and geographically distributed inference workloads.
Key considerations include security and compliance requirements for sensitive data, latency optimization for real-time applications, and scalable architectures that can adapt as AI strategies evolve from generic tools to customized, proprietary models. The data also shows that data maturity is not the adoption bottleneck many expected it to be; talent gaps are now among the key barriers.
The complete research report, “Accelerating AI: Navigating the Future of Enterprise Infrastructure,” including detailed findings and recommendations for AI infrastructure planning, is available for download.
Source: DataBank