
2025 DataOps Predictions - Part 1

As part of APMdigest's 2025 Predictions Series, industry experts offer predictions on how DataOps and related technologies will evolve and impact business in 2025.

2025: REAL-TIME DATA IS KEY FOR AI

Real-time data will be a key differentiator for competitive advantage: Industries will increasingly rely on real-time or near real-time data to maintain a competitive edge. Companies that can integrate up-to-date data into their AI systems will provide superior customer experiences with fewer issues and more personalized solutions. The ability to capture and analyze data in real time will separate industry leaders from those who struggle to modernize their data infrastructure.
Ayman Sayed
CEO, BMC Software

Enterprises Will Augment GenAI with Real-Time Data: The true value of GenAI is realized when integrated into enterprise applications at scale. While enterprises have been cautious with trial deployments, 2025 will be a turning point as they begin to scale GenAI across critical systems like customer support, supply chain, manufacturing, and finance. This will require tools to manage data and track GenAI models, ensuring visibility into data usage. GenAI must be supplemented with specific real-time data, such as vectors and graphs, to maximize effectiveness. In 2025, leading vendors will begin rolling out applications that leverage these advancements.
Lenley Hensarling
Technical Advisor, Aerospike

MULTIMODAL DATA

Multimodal data will be big, unlocking corporate value: Back in 2004, Tim O'Reilly coined the phrase, "Data is the Intel Inside." We don't think quite as much about Intel these days, but Tim was absolutely right about data. We became obsessed with data. We've been talking about data science, being data-driven, and building data-driven organizations ever since. Artificial Intelligence is the current expression of the importance of data.

One problem with being data-driven is that most of any organization's data is locked up in ways that aren't useful. Being data-driven works well if you have nicely structured data in a database. Most companies have that, but they're also sitting on a mountain of unstructured data: PDF files, videos, meeting recordings, real-time data feeds, and more. They aren't even used to thinking of this as data; it's not amenable to SQL and database-centric "business intelligence."

That will change in 2025. It will change because AI will give us the ability to unlock this data as well as the ability to analyze it. It will be able to give structure to the information in PDFs, in videos, in meeting transcripts, and in raw data coming in from sensors. In his Generative AI in the Real World interview, Robert Nishihara asked us to think of the video generated by an autonomous vehicle. Most of that is of limited value — but every now and then, there's a traffic situation that is extremely valuable. Humans aren't going to watch hours of video to extract the value; that's a job for AI. Multimodal AI will help companies to unlock the value of data like this. We're at the start of a new generation of tools for data acquisition, cleaning, and curation that will make this unstructured data accessible.
Laura Baldwin
President, O'Reilly Media

AI DRIVES NEW FOCUS ON DATA QUALITY

AI will renew the focus on data quality for two reasons: first, high-quality data is required for training and fine-tuning models; second, AI-powered analytics tools will offer a higher-resolution view of data, revealing previously undetected quality issues.
Ryan Janssen
CEO, Zenlytic
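As a concrete illustration of the "higher-resolution view" this prediction describes, here is a minimal sketch of the kind of automated quality checks a tool might run before data reaches model training. The field names and record shapes are hypothetical, not taken from any specific product.

```python
# Toy data-quality profiler: counts missing required fields and exact
# duplicate rows in a batch of records. Real AI-assisted tools go further
# (anomaly detection, schema drift), but the principle is the same.

def profile_records(records, required_fields):
    """Return simple quality metrics: missing fields, duplicates, row count."""
    issues = {"missing": 0, "duplicates": 0, "rows": len(records)}
    seen = set()
    for row in records:
        if any(row.get(f) in (None, "") for f in required_fields):
            issues["missing"] += 1
        key = tuple(sorted(row.items()))
        if key in seen:
            issues["duplicates"] += 1
        seen.add(key)
    return issues

if __name__ == "__main__":
    data = [
        {"id": 1, "email": "a@example.com"},
        {"id": 2, "email": ""},               # missing value
        {"id": 1, "email": "a@example.com"},  # exact duplicate
    ]
    print(profile_records(data, required_fields=["id", "email"]))
    # {'missing': 1, 'duplicates': 1, 'rows': 3}
```

Checks like these are cheap to run on every batch, which is what makes quality issues visible continuously rather than only after a model misbehaves.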

Enterprises that ready their data for AI will pull ahead competitively: In 2025, companies will focus on building an organized, high-quality data ecosystem to maximize AI's effectiveness and to pull ahead of their competition. This includes managing metadata through structured data catalogs, ensuring data accuracy with rigorous cleansing and validation, and establishing robust governance practices to safeguard data privacy and security. By implementing clear, ethical guidelines, organizations will create a trustworthy AI framework, empowering data scientists with easy access to reliable data for generating precise, impactful insights across business functions. Enterprises that do this will be hard to compete with. 
Scott Voigt
CEO and Founder, Fullstory

AI DRIVES DATA PIPELINE AUTOMATION

GenAI and as-code-first technologies drive data pipeline automation: The ubiquitous use of Kubernetes has led to a configuration-first experience for defining data pipelines: it's as simple as selecting a container image and adding configuration. We'll increasingly see GenAI, trained on processing and execution engines, generate this configuration and deploy pipelines automatically from natural language prompts alone. Traditional visual ETL tooling, and even low-code platforms, are now at risk of disruption. What a power user could do in a few days (remember, you still need to learn these platforms), GenAI does in seconds, spitting out configuration for real-time pipelines. This leads to the question: what is the wider future of any UX if my interface is a prompt? Just viewing data results and metrics? Engineers may as well go back to the command line!
Andrew Stevenson 
CTO, Lenses.io
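To make the configuration-first idea concrete, here is a hedged sketch of what a GenAI-emitted pipeline definition and a pre-deployment validation step might look like. The schema (name, image, source, sink, transforms) and the container image are invented for illustration and do not correspond to any particular platform.

```python
# Sketch: validate a declarative pipeline config of the kind a GenAI
# assistant might emit from a prompt like "stream orders from Kafka
# into Postgres". Validation is what keeps prompt-generated configs safe
# to deploy automatically.

REQUIRED_KEYS = {"name", "image", "source", "sink"}

def validate_pipeline(config: dict) -> list:
    """Return a list of problems; an empty list means the config is deployable."""
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - config.keys())]
    if not isinstance(config.get("transforms", []), list):
        problems.append("transforms must be a list")
    return problems

# Hypothetical model output for the prompt above:
generated = {
    "name": "orders-to-postgres",
    "image": "example.org/stream-runner:1.4",   # hypothetical container image
    "source": {"type": "kafka", "topic": "orders"},
    "sink": {"type": "postgres", "table": "orders"},
    "transforms": [{"op": "filter", "expr": "amount > 0"}],
}

if __name__ == "__main__":
    print(validate_pipeline(generated))  # [] -> ready to deploy
```

The point of the sketch: once pipelines are pure configuration, the "UX" really can collapse to a prompt plus a validator plus a deploy step.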

AI-ENHANCED DATA MANAGEMENT AND GOVERNANCE

AI is changing how companies manage and govern their data. Organizations now use data lakehouses to support data scientists and AI engineers working with large language models (LLMs). These lakehouses simplify data access, helping teams avoid juggling multiple storage systems. AI is also helping to automate manual processes like data cleaning and reconciliation—a pain point for many professionals. As AI continues to scale, automated governance will allow companies to manage data more effectively with less manual work.
Emmanuel Darras
CEO and Co-Founder, Kestra
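The reconciliation pain point mentioned above can be sketched in a few lines. This is a toy detector only: the system names and values are invented, and an AI-assisted tool would also propose fixes rather than just flag divergences.

```python
# Toy reconciliation: compare two systems' snapshots of the same records
# (e.g., a CRM vs. a billing system) and report what is missing or mismatched.

def reconcile(system_a: dict, system_b: dict) -> dict:
    """Compare two {key: value} snapshots; return what differs or is missing."""
    return {
        "only_in_a": sorted(system_a.keys() - system_b.keys()),
        "only_in_b": sorted(system_b.keys() - system_a.keys()),
        "mismatched": sorted(
            k for k in system_a.keys() & system_b.keys()
            if system_a[k] != system_b[k]
        ),
    }

if __name__ == "__main__":
    crm     = {"cust-1": 100.0, "cust-2": 250.0, "cust-3": 80.0}
    billing = {"cust-1": 100.0, "cust-2": 275.0, "cust-4": 40.0}
    print(reconcile(crm, billing))
    # {'only_in_a': ['cust-3'], 'only_in_b': ['cust-4'], 'mismatched': ['cust-2']}
```

Automating this detect-and-review loop is exactly the manual work the prediction expects AI-assisted governance to absorb.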

UNIFIED DATA ACCESS AND FEDERATION

A unified approach to data access is high on the agenda for enterprises that plan to consolidate analytics data into a single, accessible source. Data lakehouses support this by providing federated access, allowing teams across the organization to tap into the same data without duplicating it. This approach is expected to drive cross-functional analytics and reduce latency, making it easier for teams to work together on the same shared data.
Emmanuel Darras
CEO and Co-Founder, Kestra

TRUST IN DATA

Establishing trust in data will become the top priority for leaders: In the AI era, data is no longer just a byproduct of operations; it's the foundation for resilience and innovation. Without strong trust in the data they hold and use, businesses will continue to struggle to make informed decisions or leverage emerging technologies like AI. Building this trust will go beyond technology and require leaders to boost data literacy and choose a data strategy that emphasizes both capability and quality.
Daniel Yu
SVP, SAP Data and Analytics

DATA LABELING

A microscopic lens on the sources of data labeling: In technical circles, there are constant discussions around how to get the right dataset — and in turn, how to label that dataset. The reality is that this labeling is outsourced on a global scale. In many cases, it's happening internationally, and often in developing countries, with questionable conditions and levels of pay. You may have task-based workers assessing hundreds of thousands of images and being paid for the number accurately sorted. While AI engineers may be in high demand and paid well above the market rate, there are questions about this subeconomy.
Gordon Van Huizen
SVP of Strategy, Mendix

EXTENSIVE DATA SETS

Retaining Extensive Data Sets Will Become Essential: GenAI depends on a wide range of structured, unstructured, internal, and external data. Its potential relies on a strong data ecosystem that supports training, fine-tuning, and Retrieval-Augmented Generation (RAG). For industry-specific models, organizations must retain large volumes of data over time. As the world changes, relevant data becomes apparent only in hindsight, revealing inefficiencies and opportunities. By retaining historical data and integrating it with real-time insights, businesses can turn AI from an experimental tool into a strategic asset, driving tangible value across the organization.
Lenley Hensarling
Technical Advisor, Aerospike
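Since the prediction leans on Retrieval-Augmented Generation, here is a minimal sketch of the retrieval step: rank stored documents by vector similarity to a query and pass the best matches to the model as context. The 3-dimensional "embeddings" and document names are toy placeholders; a real system would use a learned embedding model and a vector index.

```python
# Toy RAG retrieval: cosine-similarity ranking over a tiny in-memory corpus.

import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, k=2):
    """corpus: list of (text, vector); return the k texts closest to the query."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

if __name__ == "__main__":
    corpus = [
        ("Q3 supply chain report", [0.9, 0.1, 0.0]),
        ("Holiday party photos",   [0.0, 0.1, 0.9]),
        ("Vendor delivery delays", [0.8, 0.2, 0.1]),
    ]
    query = [0.85, 0.15, 0.05]  # embedding of "what is slowing our shipments?"
    print(retrieve(query, corpus))
    # ['Q3 supply chain report', 'Vendor delivery delays']
```

This is why retaining historical data matters for RAG: documents whose relevance "becomes apparent only in hindsight" can still be retrieved later, provided they were kept and embedded.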

SMALL DATA

The past few years have seen a rise in data volumes, but 2025 will shift the focus from "big data" to "small data." We're already seeing this mindset shift with large language models giving way to small language models. Organizations are realizing they don't need to bring all their data to solve a problem or complete an initiative — they need to bring the right data. The overwhelming abundance of data, often referred to as the "data swamp," has made it harder to extract meaningful insights. By focusing on more targeted, higher-quality data — the "data pond" — organizations can ensure data trust and precision. This shift toward smaller, more relevant data will help speed up analysis timelines, get more people using data, and drive greater ROI from data investments.
Francois Ajenstat
Chief Product Officer, Amplitude

Check back tomorrow for more DataOps predictions.
