Data Engineers Spend 2 Days Per Week Firefighting Bad Data
September 13, 2022

Data professionals spend 40% of their time evaluating or checking data quality, and poor data quality impacts 26% of their companies' revenue, according to The State of Data Quality 2022, a report commissioned by Monte Carlo and conducted by Wakefield Research.

The survey found that 75% of participants take four or more hours to detect a data quality incident and about half said it takes an average of nine hours to resolve the issue once identified. Worse, 58% said the total number of incidents has increased somewhat or greatly over the past year, often as a result of more complex pipelines, bigger data teams, greater volumes of data, and other factors.

Today, the average organization experiences about 61 data-related incidents per month, each of which takes an average of 13 hours to identify and resolve. This adds up to an average of about 793 hours per month, per company.

However, the figure of 61 represents only those incidents known to respondents.

"In the mid-2010s, organizations were shocked to learn that their data scientists were spending about 60% of their time just getting data ready for analysis," said Barr Moses, Monte Carlo CEO and co-founder. "Now, even with more mature data organizations and advanced stacks, data teams are still wasting 40% of their time troubleshooting data downtime. Not only is this wasting valuable engineering time, but it's also costing precious revenue and diverting attention away from initiatives that move the needle for the business. These results validate that data reliability is one of the biggest and most urgent problems facing today's data and analytics leaders."

Nearly half of respondent organizations most often measure data quality by the number of customer complaints their company receives, highlighting the ad hoc and reputation-damaging way many companies handle this important element of modern data strategy.

The Cost of Data Downtime

"Garbage in, garbage out" aptly describes the impact data quality has on data analytics and machine learning. If the data is unreliable, so are the insights derived from it.

In fact, on average, respondents said bad data impacts 26% of their revenue. This validates and supplements other industry studies that have uncovered the high cost of bad data. For example, Gartner estimates poor data quality costs organizations an average $12.9 million every year.

Nearly half said that, most or all of the time, business stakeholders are impacted by issues the data team doesn't catch.

In fact, according to the survey, respondents that conducted at least three different types of data tests (for distribution, schema, volume, null, or freshness anomalies) at least once a week suffered fewer data incidents on average (46) than respondents with a less rigorous testing regime (61). However, testing alone was insufficient: stronger testing showed no significant correlation with reduced impact on revenue or stakeholders.
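To make the test categories concrete, here is an illustrative sketch (not from the report, and not any particular vendor's implementation) of three of the test types the survey names: null, volume, and freshness checks over a batch of records. All thresholds and field names are hypothetical.

```python
from datetime import datetime, timedelta, timezone

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None (null test)."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def volume_ok(rows, expected, tolerance=0.5):
    """Row count falls within +/- tolerance of the expected count (volume test)."""
    return abs(len(rows) - expected) <= expected * tolerance

def fresh(rows, ts_column, max_age):
    """Most recent timestamp is no older than `max_age` (freshness test)."""
    newest = max(r[ts_column] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age

# Hypothetical batch with one missing value
rows = [
    {"id": 1, "amount": 10.0, "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "amount": None, "loaded_at": datetime.now(timezone.utc)},
]

print(null_rate(rows, "amount"))                     # 0.5
print(volume_ok(rows, expected=2))                   # True
print(fresh(rows, "loaded_at", timedelta(hours=1)))  # True
```

In practice each check would run on a schedule per table, which is exactly why hand-written coverage struggles to keep pace as pipelines multiply.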

"Testing helps reduce data incidents, but no human being is capable of anticipating and writing a test for every way data pipelines can break. And if they could, it wouldn't be possible to scale across their always changing environment," said Lior Gavish, Monte Carlo CTO and co-founder. "Machine learning-powered anomaly monitoring and alerting through data observability can help teams close these coverage gaps and save data engineers' time."

