
Developing a Data Ethics Policy for the Insurance Industry in the New Data Economy

The Danish insurance industry faces major opportunities and challenges in the new data economy. This demands a thorough data ethics policy in which difficult dilemmas are considered and both lines and principles are drawn. In collaboration with the Danish Insurance Association, we at Nextwork have developed a comprehensive framework for establishing a data ethics position and provided recommendations for a data ethics policy for the industry, drawing on expert knowledge and case examples from the insurance and InsurTech industries. Here is a short explanation of the need for a data ethics policy and a summary of the key data ethics recommendations. Click here to get the short version in Danish.

Forces at play in the new data economy

The insurance industry is, and has always been, a data-based industry, and it is currently undergoing extensive digitalization. Across the value chain, data – and more critically, personal data – is at the core of everything: from targeting potential customers, underwriting and risk calculation, and managing retirement savings through ongoing advisory services and risk management, to claims handling and fraud detection. But what happens when the amount of available personal data grows explosively within a short period of time? What opportunities and challenges does this pose for the insurance industry?

This is increasingly a reality not only for the insurance industry but for the entire world, due to the explosion of personal data in particular over the past couple of years. This is partly due to the rise of the IoT (Internet of Things), where devices are connected to the internet with the purpose of collecting and integrating data from the real world into the digital world. To put it into perspective: an astounding 90% of the data available to us today has been created during the past two years (World Economic Forum, 2017: The Value of Data).

The dilemmas that arise with a growing amount of data

  • Opportunities: The growing amount of data on houses, water pipe conditions, driving behavior, weather conditions and a vast array of personal data can to a great extent be used to ’perfect’ the market in a number of ways. It can provide insurers with additional information, which in turn will make pricing more exact, as the figure below illustrates. This will ultimately benefit many customers, who are currently paying more than necessary for their insurance products. Insurance fraud investigators can also combine multiple data points with AI to detect fraudulent activity more precisely and avoid bothering honest insureds with unfounded suspicion. And through more or less direct access to critical health data – abiding by the rules of privacy by design – the life insurance company can make sure that people with a critical illness get their entitled claims payment right away. This means that a critically ill person can avoid facing a wall of bureaucracy in an already tough situation where resources may be few.
  • Challenges: The possibilities are many, but so are the potential downsides. Consider, for instance, the question of the uninsurable: those whom few insurers will be able to insure because of their (as the data shows) particularly high-risk profiles. How far can discrimination go in the name of guaranteeing fairness in pricing? We already discriminate today based on age, in so far as young people pay more for car insurance than older people. That is a sort of discrimination based on few data points and broad categorizations of risk groups. So which kind of price discrimination is the most ethically justifiable? And how much ‘surveillance’ should insurers impose on customers in the name of giving them the best possible service and providing relevant value when certain life events occur? What is ‘cool’ and what is ‘creepy’? What is ethically justifiable and what is not?
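The pricing effect of extra data points described above can be sketched in a few lines. A hypothetical illustration – all factor names and multipliers are invented for the example and are not actual industry figures:

```python
# Hypothetical sketch: how extra data points can sharpen premium pricing.
# All factor names and values are illustrative assumptions, not industry figures.

def premium(base: float, risk_factors: dict) -> float:
    """Multiply a base premium by one multiplier per known risk factor."""
    price = base
    for factor in risk_factors.values():
        price *= factor
    return round(price, 2)

# Coarse pricing: age group is the only signal, so every young driver
# pays the same high multiplier.
coarse = premium(500.0, {"age_under_25": 1.8})

# Granular pricing: telematics data shows this particular young driver
# brakes gently and avoids night driving, partly offsetting the age factor.
granular = premium(500.0, {
    "age_under_25": 1.8,
    "smooth_braking": 0.85,
    "no_night_driving": 0.9,
})

print(coarse)    # 900.0
print(granular)  # 688.5
```

The point of the sketch is only that with more signals, two customers in the same broad category no longer pay the same price; the honest low-risk driver stops subsidizing the rest of the group.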


First consideration – what is your ethical starting point?

Throughout the history of ideas, two competing ethical viewpoints have dominated:

  • Utilitarianism, which is an ethical approach based on the consequences of actions
  • Deontological ethics, which is an ethical approach that always puts the individual at the centre and poses certain unbreakable rules regarding the rights of the individual

Second consideration – more or less data?

  • It is possible to obtain ethically desirable ends through data minimization, which seems to be the default approach in the current data ethics debate, i.e. “What limits shall we set to constrain ourselves?” A clear example of this is when data security is achieved through minimization – in other words, ‘security by obscurity’ – which is often seen in today’s society, dominated as it is by a “better safe than sorry” philosophy. But this mindset does not come without consequences: you end up losing entirely the value that personal data holds.
  • At the other end of the spectrum, you can obtain ethically desirable ends through data maximization, utilizing personal data (sometimes, if necessary, even without consent). This can, for instance, be data concerning one’s health condition, exercise level, driving behavior, the condition of the water pipes in the cellar, etc. This approach has historically been dominated by the utilitarian “for the greater good” philosophy, since an immense amount of data could help researchers develop more efficient and personalized medicine, and insurance companies could help prevent accidents or perhaps intervene before a person breaks down from stress. It does, however, come with the risk of data being misused. Recent technological advancements within encryption and privacy by design do, however, allow “for the greater good” to be combined with full disclosure regarding privacy and individual consent.
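The privacy-by-design techniques mentioned in the last bullet can start with something as simple as pseudonymizing direct identifiers before data is pooled for analysis. A minimal sketch – the keyed-hash approach, the field names and the key below are illustrative assumptions, not a prescribed industry method:

```python
import hashlib
import hmac

def pseudonymize(record: dict, secret_key: bytes, id_field: str = "national_id") -> dict:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256), so records
    can still be linked across datasets while the raw identifier never leaves
    the insurer. A keyed hash (rather than a plain hash) resists brute-forcing
    of the small identifier space."""
    out = dict(record)  # copy so the original record is untouched
    digest = hmac.new(secret_key, record[id_field].encode(), hashlib.sha256)
    out[id_field] = digest.hexdigest()
    return out

record = {"national_id": "010190-1234", "water_pipe_age_years": 42}
safe = pseudonymize(record, secret_key=b"rotate-this-key-regularly")
print(safe["water_pipe_age_years"])  # 42 – the analytical value survives
```

The risk-relevant attributes stay usable for analysis, while re-identification requires the secret key, which stays with the data controller.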

A framework for developing a data ethics position

On the basis of utilitarian vs. deontological ethics and data minimization vs. maximization, we at Nextwork have created a framework for developing a data ethics position. Three basic data ethics positions are presented:


The Critical Position: Every individual must own all of his or her own data, and companies have to be very careful about the way they collect and use that data. Individuals and companies who subscribe to this position would argue for limited access to data, and that data should always be deleted once the purpose for which it was collected has been fulfilled, for the sake of the individual’s right to privacy.

The Progressive Position: Individuals can, through individual control over the data they produce and own, maximize the value of that same data. In this position the individual is encouraged to take ownership of personal data and has the ability to attain value by putting the data to use. Companies will in turn encourage this process, not just to be GDPR compliant, but for the sake of the customer’s best interest as well as the company’s own.

The Aggressive Position: A great society must use everyone’s data for the sake of the common good. Stakeholders within this position would want personal data to subsidize innovation and create value for society as a whole. It is less important that individuals have ownership of their data, as long as it benefits a greater good. The common good would often be managed by institutions such as a research unit, but it can also be managed by a public regulator or a private company.

The general data ethical themes that the insurance industry faces

Depending on your data ethics position, the answers to the major data ethics themes in insurance vary.

Nextwork has recommended the following to the industry:

Data Security – We advised the industry to always keep data security at the center of its data ethics and to keep Privacy by Design as the default setting. A secure data infrastructure is a pivotal requirement when data is shared between individuals, companies and the public sector. The industry’s approach to data security should therefore center on position 2 – the progressive position.

Data (self-)control and enablement – We advised the industry to fight for the individual’s right to keep ownership of their data. The individual being in control of his or her own data is a crucial element in securing the basic trust that is fundamental when data is shared between different agencies. The industry should make sure that the customer’s data is as easily accessible to the customer as possible, and that individuals always have the opportunity to improve their chances of obtaining more precise coverage. Getting a more precise risk calculation by using more data is ethically justifiable by the fairness and solidarity principles, but it requires self-control of the data and, consequently, also a fair amount of digital literacy among customers. The industry should therefore operate within position 2 – the progressive position – when working with data control.

Personalization – We advised the industry to continue to use data to differentiate between groups of risk profiles and to identify high-risk groups. This is necessary to maintain a level playing field with regard to foreign competitors and to create the best incentive for customers to improve their risk profiles. Using even more data points in risk calculation could also help break certain segmentations seen today that are in a sense ‘unfair’: young drivers would, for example, get the opportunity to affect the premium of their car insurance, which today is highly affected by their age. In order for certain high-risk groups not to become entirely uninsurable, there should be a limit to personalization-based classification, especially when it comes to those data points that cannot be improved by the individual. The data ethics approach to personalization should therefore be grounded in position 2. There will, however, be certain contexts in the future where the industry could lean toward position 3 (the aggressive position) – for instance when it comes to geodata on houses, because knowledge about houses and flooding conditions – and of course the potential solutions – is highly beneficial for the individual, society and the insurance company alike.
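The idea of limiting personalization for non-improvable data points can be sketched as a cap on the combined price effect of those factors. A hypothetical illustration – the cap value, factor names and multipliers are all invented for the example:

```python
# Illustrative sketch of limiting personalization: factors the customer can
# influence (e.g. driving behavior) feed fully into the price, while the
# combined effect of factors they cannot change (e.g. age, chronic illness)
# is clamped so no one becomes uninsurable. All numbers are assumptions.

NON_IMPROVABLE_CAP = 1.5  # assumed ethical/regulatory ceiling, not a real figure

def capped_premium(base: float, improvable: dict, non_improvable: dict) -> float:
    factor_improvable = 1.0
    for f in improvable.values():
        factor_improvable *= f
    factor_fixed = 1.0
    for f in non_improvable.values():
        factor_fixed *= f
    # Clamp the combined effect of data points the customer cannot change.
    factor_fixed = min(factor_fixed, NON_IMPROVABLE_CAP)
    return round(base * factor_improvable * factor_fixed, 2)

# A customer with a very high-risk fixed profile (1.4 * 1.6 = 2.24, clamped
# to 1.5) still gets a bounded premium, and can lower it further by improving
# behavior-based factors.
print(capped_premium(500.0, {"smooth_braking": 0.9},
                     {"age": 1.4, "chronic_condition": 1.6}))  # 675.0
```

The design choice is that personalization still rewards improvable behavior, while the uninsurability problem raised earlier is handled by bounding the part of the price the individual cannot influence.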

Behavioral Change and Incentivization – We advised the industry to give customers the ability to use their data for preventive efforts and behavioral regulation based on incentivization, as long as the customer is able to make an informed decision and has a genuine alternative to sharing data. The industry should also have the opportunity to access and use vast amounts of data when it benefits society at large, such as cutting down medical expenses. It is also ethically responsible for the industry to store and use the data that is necessary to improve fraud detection. We therefore advise the industry to use a mixture of positions 2 and 3 when dealing with behavioral change and incentivization. Read this article for more information and recommendations on countering insurance fraud.

Transparency – We advised the industry to provide full transparency when asking for access to the individual’s data. This is a key principle! The industry has a moral obligation – and it is in the industry’s best interest – to make it easy for customers to understand what they are consenting to when handing over data. This also requires enhanced transparency about the risk calculation process, where it is pivotal to inform the customer which data points are being used to estimate their risk profile. We advise the industry to use principles from the schools of legal by design and ethics by design. The industry should therefore have position 2 at the centre of its transparency policy. Transparency combined with data security and data (self-)control is far more interesting and data-ethically sound than merely focusing on securing privacy.