

(lassedesignen/Shutterstock)
As data continues to pile up, enterprises that maintain flexible approaches to managing and mining that data are the ones most likely to achieve competitive success, according to Gartner, which recently released its top 10 analytics technologies and trends for 2019.
The Global Datasphere currently measures 33 zettabytes, according to a recent IDC report, and is predicted to grow to 175 zettabytes by 2025. Navigating this data deluge is no simple matter, as the volume and velocity exceed the capabilities of existing data analytics rigs running atop legacy architectures.
“The size, complexity, distributed nature of data, speed of action, and the continuous intelligence required by digital business means that rigid and centralized architectures and tools break down,” explains Donald Feinberg, vice president and distinguished analyst at Gartner. “The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change.”
So, just what composes an “agile, data-centric architecture”? That, of course, is the $64,000 question. Nobody knows for sure, but Feinberg and other Gartner analysts took a gander at the topic, and shared what they believe to be the top 10 data analytics and technology trends that will be making headlines in 2019. (Spoiler alert: Blockchain hype appears to be fading fast.)
Top 10 Tech
The number one analytics tech trend on Gartner’s list is augmented analytics, which the analyst firm describes as the use of machine learning and AI to transform how analytics content is developed, consumed, and shared.
“By 2020, augmented analytics will be a dominant driver of new purchases of analytics and BI, as well as data science and ML platforms, and of embedded analytics,” Gartner writes. “Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.”
The augmented theme continued with number two on the list: augmented data management. Gartner says that ML and AI technologies are impacting how enterprises manage data quality, integration, metadata, and master data.
“It is automating many of the manual tasks and allows less technically skilled users to be more autonomous using data,” the company writes. “It also allows highly skilled technical resources to focus on higher value tasks.”
Continuous intelligence is the third major trend. Also known as real-time analytics, this trend encompasses all activities related to harnessing “real-time context data” to improve decision making.
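The core idea behind continuous intelligence is computing decision inputs as events arrive rather than in periodic batches. A minimal sketch of that pattern, using an invented sensor-reading stream and a sliding-window average (all names and values here are illustrative assumptions, not anything from Gartner's report):

```python
# Hypothetical sketch of continuous (real-time) analytics: each
# incoming event immediately refreshes a rolling metric that a
# downstream decision could consume, instead of waiting for a
# nightly batch aggregation.
from collections import deque


class SlidingAverage:
    """Maintains a rolling mean over the last `size` events."""

    def __init__(self, size: int):
        self.window = deque(maxlen=size)  # old events fall off automatically

    def update(self, value: float) -> float:
        self.window.append(value)
        return sum(self.window) / len(self.window)


# Simulated event stream (invented readings for illustration)
monitor = SlidingAverage(size=3)
readings = [10, 12, 30, 50, 8]
averages = [monitor.update(r) for r in readings]
print(averages)
```

The decision-relevant signal (here, the rolling average) is available the moment each event lands, which is the distinction from batch analytics.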
AI has an explainability problem, as we’ve documented in these virtual pages on more than one occasion. That’s why breaking the “black box” nature of complex ML and deep learning models is so critical, and why Gartner made it number four on its list.
Graph analytics isn’t new, but the technologies and techniques behind graph are very well aligned to solving the big data challenges enterprises face today and in the future. Thanks to graph analytics’ capability to allow you to ask “complex questions across complex data,” Gartner sees graph growing at a healthy 100% CAGR clip through 2022.
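One flavor of the “complex questions across complex data” that graph analytics handles naturally is tracing how entities are connected through intermediaries, a query that gets painful as multi-way self-joins in SQL. A small sketch using the open source NetworkX library, with an invented relationship graph for illustration:

```python
# Hypothetical example: find the chain of relationships linking two
# people through shared organizations. The entities and edges below
# are invented; the point is that path queries are a single call on
# a graph structure.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("alice", "acme_corp"),    # alice works at acme_corp
    ("acme_corp", "bob"),      # bob also works there
    ("bob", "widget_llc"),     # bob consults for widget_llc
    ("widget_llc", "carol"),   # carol runs widget_llc
])

# Shortest chain of relationships connecting alice and carol
path = nx.shortest_path(g, "alice", "carol")
print(path)
```

Expressing the same four-hop traversal relationally would require joining the edge table to itself repeatedly, which is exactly the mismatch graph databases and analytics engines are built to avoid.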
Number six on Gartner’s list is big data fabrics, which represent an emerging way to establish consistency across diverse and distributed data environments. However, the static nature of today’s bespoke data fabric architectures will necessarily give way to more dynamic approaches, which will necessitate redesigns, the analyst firm predicts.
Like chatbots? So does Gartner, which sees big things for the future of natural language processing and conversational interfaces. Fueled by big data collections and neural networking advances, Gartner says 50% of analytical queries will run through an NLP, voice, or search interface by 2020.
Data scientists predominantly conduct their ML and AI work via open source software platforms today. But by 2022, 75% of that work will be done using commercial solutions, Gartner predicts.
Okay, blockchain is still on Gartner’s radar, thanks to its “significant ramifications” for analytics use cases. But at number nine of the 10 most impactful analytics technologies and trends, it’s fair to say that Gartner isn’t too bullish on blockchain’s short-term impact.
Rounding out the top 10 is persistent memory servers, which Gartner defines as “representing a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads.”
We have been watching the capabilities of in-memory databases and in-memory data grids (IMDGs) advance in the past few years. With more data than ever to process, enterprises are welcoming the bigger memory and storage tiers, to go along with today’s speedier processors.
Keeping up with technology trends is not easy in the analytics world. It wasn’t long ago that analysts were praising the idea of big centralized clusters (hello, Hadoop?) that could house all of an enterprise’s data. But today, enterprises are looking at bringing a much more diverse and distributed set of tools and technologies to bear on the ever-growing morass of data that sits before them.
“The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products and appointing chief data officers,” Rita Sallam, research vice president at Gartner, said during the Gartner Data & Analytics Summit last week in Sydney, Australia. “It’s critical to gain a deeper understanding of the technology trends fueling that evolving story and prioritize them based on business value.”
Related Items:
Data Growth Rate in U.S. Predicted to Slow
What Gartner Sees In Analytic Hubs
Gartner Sees AI Democratized in Latest ‘Hype Cycle’