Back in 2006, British mathematician Clive Humby stated that data was the new oil. Like oil, data isn’t useful in its raw state and must be refined, processed, and distributed to deliver value. Nearly 20 years later, business practices that have grown up around this analogy include dataops automations to integrate data; data governance to ensure accuracy, compliance, and usability; and data security to protect data from threats and breaches.

Executives and business leaders understand the importance of these three functions, especially as many organizations are adding genAI products to their digital transformation strategies and establishing AI governance as an essential guardrail.

I was recently asked how data and AI leaders should measure the effectiveness of their dataops, data governance, and data security practices. There is a growing perception that businesses are pouring money into AI with no clear metrics to determine whether the investment delivers business value and reduces risks.

Business leaders have many options to choose from, so I asked experts to share what really works in practice. What metrics best reveal the value and effectiveness of dataops, data governance, and data security implementations?

Business value metrics

If you want business leaders to value investments in dataops, governance, and security, start with metrics that demonstrate the business value of reliable and timely data.

“Demonstrating business value requires CIOs to utilize KPIs that tie directly to the mission rather than traditional IT metrics that rarely resonate in the boardroom,” says Yakir Golan, CEO and co-founder of Kovrr. “Quantifying the benefits of technological initiatives, for instance, in financial terms, such as cost savings from automation or reduced risk exposure, effectively transforms the conversation into one that is more tangible at the executive level. For example, minimizing forecasted risk exposure by $2 million is much more powerful than IT ticket resolution rates.”

“Data effectiveness metrics CIOs can use include data ROI, similar to marketing attribution,” says Srujan Akula, CEO of The Modern Data Company. “Calculate specific data processing and storage costs against business value delivered and the time-to-insight.”

Measuring data ROI

Cost savings, risk reduction, and ROI tell an important story to business leaders about investment value. They are useful portfolio-level metrics when reviewing the aggregate of initiatives and platforms, but they may be difficult to capture on individual ones.
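To make Akula’s data ROI idea concrete, here is a minimal sketch in Python. The initiative names, costs, and value figures are hypothetical, and in practice estimating the value delivered is the hard part.

```python
# Minimal sketch of a data ROI calculation, in the spirit of marketing
# attribution. All figures and field names here are hypothetical.
from dataclasses import dataclass

@dataclass
class DataInitiative:
    name: str
    processing_cost: float   # compute spend attributed to the initiative
    storage_cost: float      # storage spend attributed to the initiative
    value_delivered: float   # estimated business value (e.g., cost savings)
    time_to_insight_days: float

    @property
    def roi(self) -> float:
        """ROI = (value - cost) / cost, expressed as a ratio."""
        total_cost = self.processing_cost + self.storage_cost
        return (self.value_delivered - total_cost) / total_cost

initiatives = [
    DataInitiative("churn-model", 40_000, 10_000, 250_000, 14),
    DataInitiative("ops-dashboard", 15_000, 5_000, 30_000, 3),
]

for i in initiatives:
    print(f"{i.name}: ROI {i.roi:.1%}, time-to-insight {i.time_to_insight_days} days")
```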

“The most telling KPI is often the simplest—how quickly can business teams access and act on trusted data,” says Pete DeJoy, SVP of products at Astronomer. “When you reduce that timeline from weeks to hours while maintaining security and governance standards, you create a compelling case for continued investment in dataops initiatives.”

Time to data

Time to data is a common dataops metric that measures data processing and access delays. It’s especially important for organizations that still rely on batch processing jobs running overnight to crunch data, where today’s analytics show only yesterday’s or even older data.
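A minimal sketch of how time to data might be captured from pipeline audit logs; the timestamps below are hypothetical, and the real work is instrumenting pipelines to record when data is produced and when it becomes queryable.

```python
# Sketch: time to data as the lag between when a record is produced at the
# source and when it becomes queryable by analysts. Timestamps are hypothetical.
from datetime import datetime
from statistics import median

# (produced_time, available_time) pairs, e.g. pulled from pipeline audit logs
loads = [
    (datetime(2024, 5, 1, 2, 0), datetime(2024, 5, 1, 8, 30)),   # overnight batch
    (datetime(2024, 5, 1, 9, 15), datetime(2024, 5, 1, 9, 16)),  # streaming source
]

lags = [available - produced for produced, available in loads]
print("median time to data:", median(lags))
```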

Data trust

Another valuable metric, the data trust score, can be a composite of business-relevant weightings of several indicators (a simple weighting sketch follows the list):

  • Data quality metrics such as accuracy, completeness, consistency, and validity.
  • User confidence scores measured through surveys, a data catalog, or service desk metrics related to data issues.
  • A governance metric related to how many data sets meet governance and security policies.
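Here is a simple sketch of how such a composite might be computed; the indicator values and weights below are hypothetical and should be tuned to what the business actually values.

```python
# Sketch of a composite data trust score. Indicator names, scores, and
# weights are hypothetical; tune them to business priorities.

def data_trust_score(indicators: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of 0-100 indicator scores."""
    total_weight = sum(weights.values())
    return sum(indicators[k] * w for k, w in weights.items()) / total_weight

indicators = {
    "data_quality": 88.0,       # accuracy/completeness/consistency/validity
    "user_confidence": 72.0,    # survey or service-desk derived
    "policy_compliance": 95.0,  # share of data sets meeting governance policy
}
weights = {"data_quality": 0.5, "user_confidence": 0.3, "policy_compliance": 0.2}

print(f"data trust score: {data_trust_score(indicators, weights):.1f}")
```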

If you are on a data governance or security team, think about the metrics that CIOs, chief information security officers (CISOs), and chief data officers (CDOs) will weigh when prioritizing investments and the types of initiatives to focus on.

Amer Deeba, GVP of Proofpoint DSPM Group, says CIOs need to understand what percentage of their data is valuable or sensitive and quantify its importance to the business—whether it supports revenue, compliance, or innovation. “Metrics like time-to-insight, ROI from tools, cost savings from eliminating unused shadow data, or percentage of tools reducing data incidents are all good examples of metrics that tie back to clear value,” says Deeba.

Dataops metrics

Dataops technical strategies include data pipelines to move data, data streaming for real-time data sources like IoT, and in-pipeline data quality automations. Using the reliability of water pipelines as an analogy is useful because no one wants pipeline blockages, leaky pipes, pressure drops, or dirty water from their plumbing systems.

“The effectiveness of dataops can be measured by tracking the pipeline success-to-failure ratio and the time spent on data preparation,” says Sunil Kalra, practice head of data engineering at LatentView. “Comparing planned deployments with unplanned deployments needed to address issues can also provide insights into process efficiency.”
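A minimal sketch of the ratios Kalra describes, computed from run and deployment records; the values here are hypothetical and would normally come from an orchestrator’s logs.

```python
# Sketch: pipeline success-to-failure ratio and planned-vs-unplanned
# deployment share. Record values are hypothetical.
from collections import Counter

runs = ["success", "success", "failed", "success", "success"]
deploys = ["planned", "planned", "unplanned", "planned"]

run_counts = Counter(runs)
ratio = run_counts["success"] / max(run_counts["failed"], 1)
print(f"success-to-failure ratio: {ratio:.1f}:1")

deploy_counts = Counter(deploys)
unplanned_share = deploy_counts["unplanned"] / len(deploys)
print(f"unplanned deployments: {unplanned_share:.0%}")
```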

Kalra recommends developing data observability practices, which involve monitoring data’s health, accuracy, and reliability throughout the pipelines.

“Successful adoption of observability across organizations relies on three key verticals: transparency, self-service, and tagging hygiene,” says Tameem Hourani, principal and founder at RapDev. “Measuring tagging accuracy and ensuring clean data is ingested accelerates the adoption of self-service, ensuring engineers and power-users across the organization have access to all the data they need.”
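A rough sketch of how tagging accuracy could be scored; the required tags, valid values, and records below are all hypothetical.

```python
# Sketch: tagging hygiene as the share of ingested records carrying a
# complete, valid tag set. Required tags and records are hypothetical.

REQUIRED_TAGS = {"owner", "source", "sensitivity"}
VALID_SENSITIVITY = {"public", "internal", "confidential"}

records = [
    {"owner": "finance", "source": "erp", "sensitivity": "internal"},
    {"owner": "sales", "source": "crm"},                         # missing tag
    {"owner": "hr", "source": "hris", "sensitivity": "secret"},  # invalid value
]

def is_well_tagged(record: dict) -> bool:
    return (REQUIRED_TAGS <= record.keys()
            and record["sensitivity"] in VALID_SENSITIVITY)

accuracy = sum(is_well_tagged(r) for r in records) / len(records)
print(f"tagging accuracy: {accuracy:.0%}")
```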

Tagging is one type of data enrichment that can be automated in data pipelines. Other dataops metrics can demonstrate the business impacts of operating robust pipelines at the required speed, quality, and efficiency.

“Time-to-value, data quality scores, and automation rates demonstrate how efficiently data moves from ingestion to insights, while reductions in data-related incidents and compliance risks quantify operational resilience,” says Ashwin Rajeeva, CTO & co-founder of Acceldata. “By linking these KPIs to cost savings, productivity gains, and strategic growth, CIOs can drive greater investment, shift organizational culture, and establish data as a core competitive advantage.”

Paul Boynton, co-founder and COO of Company Search Incorporated, further suggests that “dataops should use KPIs like deployment frequency, incident response time, and data quality scores.”
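As one illustration, a minimal data quality score might be the share of rows passing basic completeness and validity checks; the rules and rows below are hypothetical.

```python
# Sketch of a simple data quality score: the share of rows passing basic
# completeness and validity checks. Rules and rows are hypothetical.

rows = [
    {"customer_id": "C1", "email": "a@example.com", "age": 34},
    {"customer_id": "C2", "email": None, "age": 41},             # incomplete
    {"customer_id": "C3", "email": "b@example.com", "age": -5},  # invalid
]

def passes_checks(row: dict) -> bool:
    complete = all(row.get(f) is not None for f in ("customer_id", "email", "age"))
    return complete and 0 <= row["age"] <= 120 and "@" in row["email"]

score = sum(passes_checks(r) for r in rows) / len(rows)
print(f"data quality score: {score:.0%}")
```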

Data governance metrics

Data governance metrics are thematic and focus on accuracy, completeness, timeliness, uniqueness, and compliance. Data governance metrics can support increased end-user adoption of data-driven practices.

“Measuring data security and governance effectiveness requires tracking three essential OKRs: time-to-access for data requests, amount of data without designated owners, and proportion of classified versus unclassified data,” says Pranava Adduri, CTO and co-founder of Bedrock Security. “CIOs should prioritize demonstrating improved data ownership clarity to reduce friction between security and development teams, minimize alert fatigue in SOC teams, and accelerate policy enforcement across disparate platforms. These OKRs also support organizations’ ability to scale data operations and rapidly secure data for emerging use cases like AI model training.”
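A minimal sketch of how Adduri’s three OKRs might be computed from a data catalog and access-request logs; all of the data structures here are hypothetical.

```python
# Sketch: three governance OKRs from catalog entries and access requests.
# All records below are hypothetical.
from statistics import mean

catalog = [
    {"dataset": "orders", "owner": "sales-ops", "classified": True},
    {"dataset": "clickstream", "owner": None, "classified": False},
    {"dataset": "payroll", "owner": "hr", "classified": True},
]

# Hours between an access request being filed and access being granted
access_grant_hours = [4, 30, 12]

ownerless = sum(1 for d in catalog if d["owner"] is None)
classified_share = sum(d["classified"] for d in catalog) / len(catalog)

print(f"datasets without owners: {ownerless}")
print(f"classified data: {classified_share:.0%}")
print(f"mean time-to-access: {mean(access_grant_hours):.0f}h")
```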

Other recommendations focused on measuring the organizational impacts of data governance. Edward Calvesbert, VP of product management for the IBM watsonx platform, says, “KPIs for a data governance program might entail reductions in data redundancy, improved data usage, cost savings from reductions in data wrangling, and faster time-to-market of new insights and applications.”

Compliance and risk management benefits are a third area to be measured. “Data governance generates savings in compliance costs, avoiding fines and reputational damage,” adds Calvesbert. “These are KPIs with impact that ripple far beyond the CIO’s office, unlocking opportunity across the enterprise.”

Akula of The Modern Data Company adds KPIs on the effectiveness of data governance programs. “For security and governance, track the sensitive data exposure rate, the percentage of critical data properly controlled, and the data duplication index, which measures unnecessary copies that indicate governance gaps,” Akula says.
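Of these three, the data duplication index is the most mechanical to compute. Here is a rough sketch that flags unnecessary copies by hashing file contents; the storage paths are hypothetical.

```python
# Sketch: a data duplication index computed by hashing file contents.
# Paths are hypothetical; point it at a real storage inventory.
import hashlib
from collections import defaultdict
from pathlib import Path

def duplication_index(paths: list[Path]) -> float:
    """Share of files whose content duplicates another file."""
    by_hash: dict[str, int] = defaultdict(int)
    for p in paths:
        digest = hashlib.sha256(p.read_bytes()).hexdigest()
        by_hash[digest] += 1
    duplicates = sum(count - 1 for count in by_hash.values())
    return duplicates / len(paths) if paths else 0.0

# Usage (hypothetical directory):
# files = list(Path("/data/exports").glob("**/*.csv"))
# print(f"duplication index: {duplication_index(files):.0%}")
```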

Another important consideration is end-user adoption. Deeba of Proofpoint notes it’s also critical to measure how well teams are adopting governance policies and accessing data securely without friction.

Global and regulated organizations must also measure their data provenance and sovereignty practices, and these regulations also impact many smaller domestic businesses. Data provenance tracks the origin, history, and transformations of data throughout its lifecycle and is important in regulated industries where capturing data lineage is required. Data sovereignty is the legal and regulatory ownership of data based on the country or region where it is stored or processed.

“Organizations should measure their governance program’s effectiveness through quantifiable metrics that demonstrate sovereignty, like data lineage accuracy and data exposure incidents,” says Jeremy Kelway, VP of engineering for analytics, data, and AI at EDB. “Success indicators should include decreased sensitive data exposure risks, improved data locality compliance scores, and enhanced visibility into how data interfaces with AI systems across jurisdictional boundaries.”
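A minimal sketch of two of these indicators, computed over a hypothetical dataset inventory: lineage coverage (how many data sets have recorded lineage) and data locality compliance (how many are stored in their required jurisdiction).

```python
# Sketch: sovereignty-oriented indicators over a hypothetical inventory.

datasets = [
    {"name": "eu_customers", "lineage_recorded": True,  "region": "eu-west", "required_region": "eu-west"},
    {"name": "us_orders",    "lineage_recorded": False, "region": "us-east", "required_region": "us-east"},
    {"name": "eu_payments",  "lineage_recorded": True,  "region": "us-east", "required_region": "eu-west"},
]

lineage_coverage = sum(d["lineage_recorded"] for d in datasets) / len(datasets)
locality_compliance = sum(
    d["region"] == d["required_region"] for d in datasets
) / len(datasets)

print(f"lineage coverage: {lineage_coverage:.0%}")
print(f"data locality compliance: {locality_compliance:.0%}")
```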

What can be incredibly challenging is getting the business to collaborate in implementing data governance functions. Just getting data owners assigned and data sets classified can be an uphill battle. When data governance teams can’t get the collaboration, what other options should they consider?

“Fostering a competitive data security and governance culture drives broader buy-in and success,” says Alastair Parr, executive director of GRC Solutions at Mitratech. “KPIs and OKRs should incorporate department- and function-level comparative scorings based on the selected metrics.”

Data security metrics

Experts suggest aligning data security metrics to standards such as ISO 27001, NIST CSF 2.0, and CIS.

Greg Anderson, CEO and founder of DefectDojo, says, “In the simplest of terms, measuring effectiveness comes down to what framework and level you select, and then monitoring what percentage of your organization is in compliance. ISO 27001 is probably the most popular standard, but it’s also broad.”

Anderson suggests tracking the following metrics regardless of the frameworks being used:

  • Incident metrics, including the number of breaches and unauthorized access attempts.
  • Mean time to detect (MTTD) and mean time to respond (MTTR), which capture the speed of identifying and resolving security issues (see the sketch after this list).
  • Pass/fail rates for GDPR, HIPAA, and other compliance requirements.
  • Vulnerability metrics, including open vulnerabilities and patching frequency.
  • Training completion, such as the percentage of staff trained on security protocols.
  • The percentage of sensitive data encrypted.
  • Access control metrics for addressing least-privilege access.
  • Percentage of data cataloged by severity and criticality (this metric works in collaboration with the data governance function).
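A minimal sketch of the MTTD and MTTR calculations; the incident timestamps below are hypothetical and would normally come from SIEM or incident-tracking tools.

```python
# Sketch: MTTD and MTTR in hours, computed from incident timestamps.
# The incident records are hypothetical.
from datetime import datetime
from statistics import mean

incidents = [
    # (occurred, detected, resolved)
    (datetime(2024, 6, 1, 8, 0), datetime(2024, 6, 1, 9, 30), datetime(2024, 6, 1, 14, 0)),
    (datetime(2024, 6, 3, 22, 0), datetime(2024, 6, 4, 1, 0), datetime(2024, 6, 4, 6, 0)),
]

mttd = mean((detected - occurred).total_seconds() for occurred, detected, _ in incidents) / 3600
mttr = mean((resolved - detected).total_seconds() for _, detected, resolved in incidents) / 3600

print(f"MTTD: {mttd:.1f}h, MTTR: {mttr:.1f}h")
```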

Dataops, governance, and security metrics in practice

Kajal Wood, VP of software engineering at Capital One, shared a detailed perspective on how to put the theory of data effectiveness into practice. “Measuring effectiveness starts with building a well-governed and high-quality data ecosystem. To do this, we consider data quality metrics like accuracy, completeness, accessibility, and availability, to ensure teams can trust and use data effectively. Observability and security KPIs like data lineage coverage, ensuring all shared and used data is registered in the catalog, sensitive data detection and remediation, and incident response times demonstrate governance maturity. Dataops efficiency metrics like pipeline deployment speed, automation rates, and consumption experience reflect agility.”

The goal of such an encompassing list of metrics, Wood adds, “is to align these metrics with business outcomes—faster innovation, reduced risk, and improved decision-making—to unlock tangible value from data.”

A mature, data-driven organization can support metrics like these, but it takes time to develop the practices. Starting with fewer meaningful metrics is often better than having too many. Put your metrics through a simple three-question test:

  • Will the business understand the metric, and does it connect to value?
  • Does it measure where investments are being made and demonstrate improvements?
  • Is capturing the metric automated and easy to report on?

As more organizations invest in dataops, data governance, and data security, metrics that measure the value, operational efficacy, and risk involved are crucial.