Due to the wide range of potential datasets and use cases, as well as the relative infancy of data valuation, there are no simple or universally agreed-upon methods. High option value and externalities mean that data value may fluctuate unpredictably, and seemingly worthless data may suddenly become extremely valuable at an unspecified future date. Information-theoretic measures, such as
entropy,
information gain, and information cost, are useful for anomaly and outlier detection. In data-driven analytics, a common problem is quantifying whether larger data sizes and/or more complex data elements actually enhance, degrade, or alter the information content and utility of the data. The
data value metric (DVM) quantifies the useful information content of large and heterogeneous datasets in terms of the tradeoffs between the size, utility, value, and energy of the data. Such methods can be used to determine whether appending, expanding, or augmenting an existing dataset may improve the modeling or understanding of the underlying phenomenon.
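As a minimal sketch of how entropy and information gain can serve as rough proxies for the information content of a data column, consider the following; all column values here are hypothetical examples, not part of any specific valuation framework:

```python
# Minimal sketch: Shannon entropy and information gain as rough proxies for
# the information content of a data column. All values are hypothetical.
from collections import Counter
from math import log2

def shannon_entropy(values) -> float:
    """Entropy in bits: higher means the column carries more information."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def information_gain(values, groups) -> float:
    """Entropy reduction from partitioning `values` into `groups`."""
    n = len(values)
    return shannon_entropy(values) - sum(
        len(g) / n * shannon_entropy(g) for g in groups)

shannon_entropy(["a", "b", "c", "d"])   # 2.0 bits: uniform, maximally informative
shannon_entropy(["a", "a", "a", "a"])   # 0 bits: a constant column adds nothing
information_gain(["a", "a", "b", "b"], [["a", "a"], ["b", "b"]])  # 1.0 bit
```

On this view, appending rows that leave entropy unchanged adds little information, which is the kind of tradeoff the DVM formalizes.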
==Infonomics valuation models==
Doug Laney identifies six approaches for valuing data, dividing these into two categories: foundational models and financial models. Foundational models assign a relative, informational value to data, whereas financial models assign an absolute, economic value.
===Foundational models===
• Intrinsic Value of Information (IVI) measures data value drivers, including the correctness, completeness, and exclusivity of the data, and assigns a value accordingly.
• Business Value of Information (BVI) measures how fit the data is for specific business purposes (e.g., initiative X requires 80% accurate data that is updated weekly – how closely does the data match this requirement?).
• Performance Value of Information (PVI) measures how the usage of the data affects key business drivers and KPIs, often using a control group study.
===Financial models===
• Cost Value of Information (CVI) measures the cost to produce and store the data, the cost to replace it, or the impact on cash flows if it were lost.
• Market Value of Information (MVI) measures the actual or estimated value for which the data would be traded in a data marketplace.
• Economic Value of Information (EVI) measures the expected cash flows, returns, or savings from the usage of the data.
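The two cash-flow-based financial models, CVI and EVI, can be sketched numerically. The sketch below is illustrative only: the cost figures, cash flows, and discount rate are hypothetical, and discounting expected cash flows to present value is one common way to operationalize EVI, not a formula prescribed by Laney:

```python
# Illustrative sketch of two financial models. All figures are hypothetical,
# and net-present-value discounting is an assumed operationalization of EVI.

def cost_value(production_cost: float, storage_cost: float) -> float:
    """Cost Value of Information (CVI): cost to produce and store the data."""
    return production_cost + storage_cost

def economic_value(cash_flows: list[float], discount_rate: float) -> float:
    """Economic Value of Information (EVI): present value of the expected
    cash flows attributed to using the data."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

cvi = cost_value(production_cost=50_000, storage_cost=10_000)        # 60,000
evi = economic_value([30_000, 30_000, 30_000], discount_rate=0.1)    # ~74,606
```

Note that the two models can diverge sharply: here the data costs 60,000 to produce and store but is expected to return roughly 74,600 in discounted cash flows.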
==Bennett Institute valuations==
Research by the Bennett Institute divides approaches for estimating the value of data into market-based and non-market-based valuations.
• A consumption-based approach builds on the principles of the modified cost value approach by assigning data users different weightings based on the relative value they contribute to the organization. These weightings are included in the modelling of data usage statistics and further modify the measured value of the data.
• Data hub valuation uses a cost-based approach that measures the cost of data hubs, the large repositories where data is stored, rather than the cost of separate datasets. The data hub cost can then be modified, as in the consumption-based and modified cost value approaches. Another hub valuation approach uses a modified market value approach, measuring the savings to users from accessing data via hubs rather than individually from producers, and users' willingness to pay for access to data hubs.
• A stakeholder approach engages key stakeholders to value data, examining how data supports activities that external stakeholders identify as creating value for them. It uses a model that combines the total value created by the organization, a weighted list of value-creating initiatives (as defined by external stakeholders), and an inventory of data assets. This approach was developed in a collaboration between Anmut, a consultancy firm, and Highways England, a public sector agency for which data valuations based on market value, income gains, or economic performance are less meaningful. The approach can also be applied in the private sector.
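The consumption-based approach can be sketched as apportioning a base, cost-derived data value across user classes according to weighted usage statistics. The user classes, weights, and query counts below are hypothetical examples, not figures from the Bennett Institute research:

```python
# Illustrative sketch of a consumption-based valuation: a base (cost-derived)
# value is apportioned by usage, with each user class weighted by the relative
# value it contributes. All classes, weights, and counts are hypothetical.

base_value = 100_000  # e.g. modified cost value of the dataset

usage = {"analytics_team": 600, "operations": 300, "reporting": 100}   # queries
weights = {"analytics_team": 3.0, "operations": 1.5, "reporting": 1.0}

weighted_usage = {u: n * weights[u] for u, n in usage.items()}
total = sum(weighted_usage.values())

# each user class's share of the dataset's measured value
value_by_class = {u: base_value * w / total for u, w in weighted_usage.items()}
```

The weightings shift value toward the heaviest, highest-value consumers: here the analytics team accounts for roughly three quarters of the measured value even though it issues only 60% of the queries.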
==Companies performing data valuations to get data assets on the balance sheet==
• Data Capitalisation Partners (DCP) aims to have data recognised on the balance sheet as an independent asset class by applying GAAP, IFRS, and AASB accounting standards, together with International Valuation Standards Council (IVSC) compliance, to issue defensible valuations that turn unrecognised intangible data assets into collateralisable financial assets and increase enterprise value. DCP delivers IVSC-compliant benchmarks and supporting accounting frameworks intended to achieve formal balance-sheet recognition of data assets and to provide transparency for auditors, regulators, and capital markets; the firm states that it has realised over $8.2 billion in data asset value for client companies. Its valuation methodologies are presented as mitigating systematic risk and countering the formation of 'asset bubbles' by establishing evidence-based values for intangible data assets.
Data Valuation as a Service provides:
• A data valuation report drawing on 17 different data valuation methodologies and calculations to create a defensible valuation of data unique to the company.
• An interrogation of data via data due diligence, covering strategy, security, governance, monetization, substantiation, privacy, and people.
• A review of data monetization strategies against each use case, in order to glean as much current and future value from the data as possible.
• Analytic evidence of data value, as well as model forecasts of how data drivers, use cases, and monetization affect the data valuation.

==References==