Data Valuation Approaches

Even though data is regarded as the currency of the digital economy and the amount of data is growing exponentially, there is currently no international standard for data valuation. Last month, the New York Times rightly noted that a standard method would at least provide a common framework for companies and regulators (New York Times, How Much Is Your Data Worth?, March 25, 2019).

As a result, various approaches to data valuation have evolved in academia and business; they are reviewed below:

Market prices:

Market prices for data are the exception. Companies rarely make their data available externally, so as not to lose their competitive advantage. Nevertheless, there are marketplaces for the monetization and purchase of data, such as the French company Dawex, which even allows auction-based data trading.

However, the comparability of data sets is limited. Data from the same industry and the same domain, for example IoT sensor data from a car's brakes, can differ in quality, which is reflected in different prices.

Use:

The first approach to financial data valuation probably dates back to 1993 (R. Glazer: Measuring the Value of Information: The Information-Intensive Organization). This scoring approach simply compares how turnover and costs change with and without a specific data set.
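
This with/without comparison fits in a few lines; the figures below are invented purely for illustration:

```python
# Glazer-style with/without comparison (invented figures, in kEUR):
# the value of a data set is the profit delta it enables.
revenue_with, costs_with = 1200, 700        # operating the business with the data set
revenue_without, costs_without = 1000, 650  # operating it without the data set

data_set_value = (revenue_with - costs_with) - (revenue_without - costs_without)
print(data_set_value)  # 150: the profit attributable to the data set
```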

Costs:

The cost-based approach is based on the assumption that data is worth what its collection costs. This proposal from 1999 (D. Moody, P. Walsh: Measuring the Value of Information: An Asset Valuation Approach) no longer seems appropriate in times of social media, for example, since customers now provide their data to companies free of charge and on a large scale.

Security risks:

One year later, in 2000, the proposal was made to value data based on its security risks (R. S. Poore: Valuing Information Assets for Security Risk Management). The rationale is that costs alone do not sufficiently reflect the risks that data collections pose within companies, and these risks affect the value of the data. Three risks were described as examples to be taken into account in the valuation (a minimal sketch of such a discount follows the list):

1. Security: If a company’s data exposes a person to the risk of injury, illness or death, additional costs may arise for litigation, among other things.

2. Politics: If a company relies on information that may be restricted or prohibited by future legislation, there is a risk that the data may no longer be usable and therefore become worthless.

3. Crime: If a company, or a manager of that company, is subject to criminal sanctions as a result of the data, an appropriate valuation discount must also be applied.
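
Poore's paper does not prescribe a single formula, but the idea of a risk-based discount can be sketched as an expected-loss deduction from a base value; the probabilities and loss amounts below are invented for illustration:

```python
# Risk-adjusted data value: discount a base value by the expected loss from
# the risks above (hypothetical probabilities and loss amounts, in kEUR).
base_value = 500  # e.g. a cost- or use-based valuation of the data set

risks = [
    ("safety litigation",       0.02, 2000),  # (risk, annual probability, loss)
    ("restrictive legislation", 0.10,  500),
    ("criminal sanctions",      0.01, 1000),
]

expected_loss = sum(p * loss for _, p, loss in risks)
risk_adjusted_value = max(base_value - expected_loss, 0)
print(expected_loss, risk_adjusted_value)  # 100.0 400.0
```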

Added Value:

In 2016, Bill Schmarzo, then CTO of Dell EMC, described a five-step data valuation process in a blog post. This process builds on Glazer's approach from 1993 and thus again focuses on the added value that can be achieved from the data. In 2017, the Organisation for Economic Co-operation and Development endorsed this view as well in the OECD Transfer Pricing Guidelines.

According to Schmarzo, the first step of the process identifies the business areas that rely on data and provides a rough estimate of a business initiative's financial impact on sales.

The second step combines stakeholder interviews with a moderated workshop to identify and discuss the concrete decisions stakeholders need to make to support the intended business initiative.

The third step is to quantify the economic value of these individual decisions. A range is formed and the most likely value (e.g. mean, mode, median) is selected.

In the fourth step, the value of each data source is estimated for each of these individual decisions, and in the fifth step, the overall economic value of each data source is determined. You can read more about this at https://infocus.dellemc.com/william_schmarzo/determining-economic-value-data.
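
To make the rollup of steps three to five concrete, here is a minimal sketch; the decisions, value ranges, and attribution weights are hypothetical and not taken from Schmarzo's post:

```python
from statistics import median

# Hypothetical decisions supporting one business initiative, each with a
# low/likely/high estimate of its economic value (step 3), in kEUR.
decisions = {
    "optimize campaign targeting": [50, 80, 150],
    "reduce customer churn":       [30, 60, 90],
}

# Step 4: assumed share of each decision's value attributable to each source.
attribution = {
    "optimize campaign targeting": {"CRM data": 0.7, "web analytics": 0.3},
    "reduce customer churn":       {"CRM data": 0.5, "web analytics": 0.5},
}

# Step 3: pick the most likely value from each range (here: the median).
decision_value = {d: median(r) for d, r in decisions.items()}

# Step 5: aggregate the attributed values per data source.
source_value = {}
for d, weights in attribution.items():
    for source, w in weights.items():
        source_value[source] = source_value.get(source, 0) + w * decision_value[d]

print(source_value)  # {'CRM data': 86.0, 'web analytics': 54.0}
```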

The literature now offers further suggestions, based on this idea, on how companies can implement these approaches. For this purpose, the Cross-Industry Standard Process for Data Mining (CRISP-DM), established in the IT industry, or the Stage-Gate business model, in conjunction with the IDW S 5 / IAS 38 valuation standards, is used to determine the business areas and data sources.

I will publish a follow-up post with a valuation example in order to illustrate this more clearly.

Shapley Value:

As early as 2015, the tax literature suggested that value-added contributions in digital business models should be divided between associated companies using the Shapley Value (Pellefigue, J.: International Transfer Pricing Economics for the Digital Economy). The Shapley Value, developed in the 1950s by Lloyd Shapley, who was awarded the Nobel Memorial Prize in Economic Sciences in 2012, comes from cooperative game theory: it distributes the value created by a coalition among its members according to their average marginal contributions.

In 2019, scientific articles were published that also use the Shapley Value as a basis for valuing data (Ghorbani, A./Zou, J.: Data Shapley: Equitable Valuation of Data for Machine Learning; Jia, R. et al.: Towards Efficient Data Valuation Based on the Shapley Value).
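
To illustrate the idea, the following minimal sketch computes exact Shapley values for two hypothetical data sources; the worth of each coalition (e.g. the revenue achievable with exactly that data) is an invented figure. The exact computation grows exponentially with the number of sources, which is why the papers cited above focus on efficient approximations for machine-learning data sets:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values for a small set of 'players' (here: data sources).

    `value` maps a frozenset of players to the worth of that coalition.
    Runtime is exponential in len(players), so this suits only tiny examples.
    """
    n = len(players)
    result = {}
    for p in players:
        others = [q for q in players if q != p]
        phi = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # Standard Shapley weight: |S|! * (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += weight * (value(s | {p}) - value(s))
        result[p] = phi
    return result

# Hypothetical worths: revenue (in kEUR) achievable with each data subset.
worth = {
    frozenset(): 0, frozenset({"CRM"}): 100, frozenset({"IoT"}): 60,
    frozenset({"CRM", "IoT"}): 200,  # complementary data: joint value > sum
}

print(shapley_values(["CRM", "IoT"], lambda s: worth[s]))
# {'CRM': 120.0, 'IoT': 80.0}
```

Note how the synergy between the two complementary sources (their joint worth exceeds the sum of their individual worths) is split between them according to their average marginal contributions, and the two values add up exactly to the worth of the full coalition.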

Due to its complexity, I will also publish a separate article that tries to present this topic as simply as possible.

Next year, according to projections by the telecommunications company Cisco, 254 exabytes (254 billion gigabytes) of data are expected to be transferred over the Internet. Extensive automation of data valuation for companies, combined with valuation transparency for shareholders and stakeholders, would therefore be desirable.
