Even as organizations strive to capitalize on their ever-growing data troves to scale operations and improve business outcomes, only 17% of the data they ingest or land consists of emergent data types, and only 9% of that data is processed or analyzed, according to a new report from BMC, Putting the "Ops" in DataOps: Success factors for operationalizing data.
This signals a significant opportunity to benefit from emergent data types that are critical for initiatives such as generative AI, LLMs, FinOps, and sustainability.
The study defined four maturity levels:
■ Developing – discovery phase with strategies in their infancy, and practices and architecture not closely aligned to business outcomes.
■ Functional – growth phase with strategies primarily developed and some high-priority practices and architecture linked to business outcomes.
■ Proficient – adolescent phase representing a fully established strategy with nearly all practices and architecture linked to critical business outcomes.
■ Exceptional – innovation phase with a perpetually optimized strategy, practices, and architecture that generate competitive differentiation and business value.
DataOps strategy is closely aligned with data management maturity. Of respondents with exceptional data management maturity, 27% said they use DataOps methodologies across their organization to support all data-driven activities, compared with 19% at the proficient level, 15% at the functional level, and 10% at the developing level. Even among organizations with exceptional data maturity, only 41% report "high maturity" in data pipeline and application workflow orchestration functions.
Higher data management and DataOps maturity are linked to higher reported adoption of, and success with, data-driven activities. Among organizations with mature practices, 75% have a Chief Data Officer, compared with only 54% of those with less mature practices.
Challenges Obstruct Flow of Data
Multiple challenges continue to impact the flow of data in businesses, including those related to people, processes, and technology. These include a lack of skills (48%), human error and mistakes (43%), limitations on scalability (40%), and a lack of technology automation (43%). A lack of automation can exacerbate a lack of skills, while an appropriate use of automation can amplify skills already available.
"AI and data are in a cosmic dance, and data challenges are increasing dramatically in the AI era," said Ram Chakravarti, chief technology officer at BMC. "This study highlights how organizations with mature data practices can achieve better business outcomes. Implementing DataOps methodologies to enhance collaboration and operational efficiency, maintaining high data quality through pragmatic investments, and developing robust data pipeline orchestration systems can help unlock value at scale."
Methodology: BMC commissioned 451 Research, part of S&P Global Market Intelligence, to conduct the survey in late 2023, gathering insights from 1,100 IT, data, and business professionals at large enterprises across multiple industries in eleven countries worldwide.
The Latest
Half of all employees are using Shadow AI (i.e., non-company-issued AI tools), according to a new report by Software AG ...
On their digital transformation journey, companies are migrating more workloads to the cloud, which can drive up costs due to the growing volume of cloud resources required ... Here are four critical components of a cloud governance framework that can help keep cloud costs under control ...
Operational resilience is an organization's ability to predict, respond to, and prevent unplanned work to drive reliable customer experiences and protect revenue. This doesn't just apply to downtime; it also covers service degradation due to latency or other factors. But make no mistake — when things go sideways, the bottom line and the customer are impacted ...
Organizations continue to struggle to generate business value with AI. Despite increased investments in AI, only 34% of AI professionals feel fully equipped with the tools necessary to meet their organization's AI goals, according to The Unmet AI Needs Survey conducted by DataRobot ...
High-business-impact outages are costly, and fast MTTx metrics, such as mean time to detect (MTTD) and mean time to resolve (MTTR), are crucial, with 62% of businesses reporting a loss of at least $1 million per hour of downtime ...
Organizations recognize the benefits of generative AI (GenAI) yet struggle to implement the infrastructure necessary to deploy it, according to The Future of AI in IT Operations: Benefits and Challenges, a new report commissioned by ScienceLogic ...
Splunk's latest research reveals that companies embracing observability aren't just keeping up; they're pulling ahead. Whether it's unlocking advantages across their digital infrastructure, achieving deeper understanding of their IT environments, or uncovering faster insights, organizations are slashing resolution times like never before ...
A majority of IT workers surveyed (79%) believe the current service desk model will be unrecognizable within three years, with nearly as many (77%) saying new technologies will render it "redundant" by 2027, according to The Death (and Rebirth) of the Service Desk from Nexthink ...
Monitoring your cloud infrastructure on Microsoft Azure is crucial to keeping it functioning optimally ... In this blog, we will discuss the key aspects to consider when selecting the right Azure monitoring software for your business ...
All eyes are on the value AI can provide to enterprises. Whether it's simplifying the lives of developers, more accurately forecasting business decisions, or empowering teams to do more with less, AI has already become deeply integrated into businesses. However, it's still too early to evaluate its impact using traditional methods. Here's how engineering and IT leaders can make educated decisions despite the ambiguity ...