Customer Analytics: Securing our Future
In the previous blog in this series I looked at how the use of personas, and creating customer journey maps for those personas, can give new insights into how to engage customers. By collecting customer data and tracking user interactions across all touchpoints, it’s possible not only to create a revealing profile of a customer’s interests and behaviour, but also to better understand whether products and marketing are addressing customers’ wants and needs.
Technology is now rapidly opening up new sources of customer data, from health and well-being apps to our children’s activity online. Soon our heating systems, cars, refrigerators and every other connected appliance about to hit the shelves will be churning out a stream of data back to the marketing departments of big corporations. The effective use of this big data undoubtedly has great potential to improve our lives, but there is also an increased risk that an individual’s privacy could be compromised, or that the data could be used to discriminate against an individual unfairly. In a recent case in the UK, vulnerable customers were advised to buy energy on tariffs that were far higher than others available. The same is true of insurance and medical care, where corporations have repeatedly been found exploiting data to prey on the vulnerable. Add to this the substantial risks of identity theft and fraud as a consequence of data breaches, and consumers could be forgiven for feeling anxious about how well their data is being protected, and how it is being used.
In an effort to stop corporations from exploiting big data negatively and to better protect consumers, the European Union has raised the bar on data protection by drafting the General Data Protection Regulation (GDPR). This new set of rules will apply not only to organisations within the EU, but also to companies operating outside the EU that have networks or trade data with partners within the EU. Enforcement is expected to start in the spring of 2018.
The European Commission’s website states that:
The objective of this new set of rules is to give citizens back control over their personal data, and to simplify the regulatory environment for business.
Furthermore, as reported by consultancy group itgovernance,
The Regulation will enforce tough penalties – proposed fines up to 4% of annual global revenue or €20million, whichever is greater
For a large multinational, the fines could be very substantial and could have a significant impact on share prices.
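To put the “whichever is greater” rule into perspective, here is a minimal sketch in Python of the maximum-fine calculation; the revenue figure used is a hypothetical assumption, not any real company’s accounts.

```python
# Sketch of the GDPR maximum-fine rule quoted above: the greater of
# 4% of annual global revenue or EUR 20 million.

def max_gdpr_fine(annual_global_revenue_eur: float) -> float:
    """Return the upper bound of a GDPR fine for the given annual revenue."""
    return max(0.04 * annual_global_revenue_eur, 20_000_000)

if __name__ == "__main__":
    revenue = 10_000_000_000  # hypothetical EUR 10 billion multinational
    print(f"Maximum fine: EUR {max_gdpr_fine(revenue):,.0f}")  # EUR 400,000,000
```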
The GDPR sets out eight Data Protection Principles that must be followed. The ICO, the UK’s independent Information Commissioner’s Office, has provided clarification of how these principles need to be applied. For example, Principle 1 refers to processing personal data fairly and lawfully. As described on their website:
In practice, it means that you must:
- have legitimate grounds for collecting and using the personal data;
- not use the data in ways that have unjustified adverse effects on the individuals concerned;
- be transparent about how you intend to use the data, and give individuals appropriate privacy notices when collecting their personal data;
- handle people’s personal data only in ways they would reasonably expect; and
- make sure you do not do anything unlawful with the data.
Perhaps one of the key new areas is the use of big data and analytics for customer profiling. This is covered under Principle 6 of the GDPR, under the somewhat opaque heading of ‘automated decision making’. ComputerWeekly has produced a number of documents and guides to help businesses understand how the GDPR will affect them. In one of their latest blogs they explain:
The regulation introduces a number of restrictions on profiling, including the right for an individual not to be subject to a decision which significantly affects the individual and which is based on automated profiling. In many cases, profiling will only be permitted where the explicit consent of the individual has been obtained.
The implications for targeted marketing are unclear, but potentially significant, as the regulation makes companies accountable for ensuring that customers (see the sketch after this list):
- Are made fully aware of exactly how their data will be used
- Explicitly agree to the way in which the data will be used
- Do not suffer any negative consequences from the use of their data.
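As a rough illustration of how the first two obligations might be recorded (the third, avoiding negative consequences, is a matter of judgement rather than a flag), here is a minimal Python sketch; the record fields and the profiling check are assumptions made for illustration, not part of the GDPR or any specific compliance toolkit.

```python
# Hypothetical consent record illustrating the first two obligations above.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    customer_id: str
    informed_of_use: bool   # customer told exactly how their data will be used
    explicit_consent: bool  # customer explicitly agreed to that use
    purpose: str            # the specific use consented to, e.g. "profiling"

def may_profile(record: ConsentRecord) -> bool:
    """Only allow profiling when the customer was informed and explicitly agreed."""
    return record.informed_of_use and record.explicit_consent and record.purpose == "profiling"

# Example: a customer who was informed but never gave explicit consent
record = ConsentRecord("cust-001", informed_of_use=True, explicit_consent=False, purpose="profiling")
assert not may_profile(record)
```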
As ComputerWeekly reports:
These restrictions are likely to have a considerable impact on businesses engaging in, for example, big data analytics, as well as more general business activities, such as credit scoring and employee monitoring
In an article by the International Association of Privacy Professionals, they add that data subjects (customers)
…may also request to know the purposes of processing, the period of time for which data will be stored, the identity of any recipients of the data, the logic of automatic data processing, and the consequences of any profiling.
Corporations are now in real danger of drifting into a situation in which they are in breach of the new regulations, face substantial fines and seriously damage customer relations.
In the final blog of this series I will be looking at how customers are becoming increasingly concerned about the inability of corporations to keep their data safe, and how high-profile data breaches are eroding public confidence.
Mark Jenkins has worked in the IT industry for over 15 years as a BI and analytics consultant, and more recently as ROC Product Manager for Subex Ltd. He has designed and deployed solutions for global companies in many sectors, including insurance, utilities and telecommunications. Mark holds a BSc (Hons) in Computer Science from Manchester University (UK).