Big data and data protection in the advertising industry


Advertising is a data-rich industry: data now informs key decisions such as how much to bid for impressions, where to place adverts, whom to target and the overall design of advertising campaigns.

The benefit of using mathematical models to manage the collection and analysis of data, and to make data-based decisions, has long been recognised. The advance of cloud computing and machine learning technology (MLT) / artificial intelligence (AI) is taking things a step further: algorithms now learn from the data they consume and adapt their output accordingly. MLT and AI thrive on access to large data sets, and predictive models improve as they consume more data. The technology is primed to handle high-volume, real-time data with many variables, such as data collected from the internet about user habits and preferences.

Profiling and personalisation

The application of this technology to the advertising industry offers exciting potential. In particular, it is increasingly being put to work to enable personalised ad targeting: learning the type of advertising to which an individual is most responsive, tracking changes in that individual's tastes and preferences, and using that data to select the products and adverts most likely to be relevant and of interest to them. Arguably this is in the spirit of the CAP Code (the rules for non-broadcast advertising enforced by the industry regulator, the Advertising Standards Authority), which requires advertisers not to make persistent and unwanted marketing communications. However, the use of machine learning for customer profiling is not without legal risk.

Data protection considerations

In March 2017 the Information Commissioner’s Office (ICO) published a paper on big data, artificial intelligence, machine learning and data protection, encouraging organisations to bear in mind compliance with data protection legislation (and in particular the General Data Protection Regulation (GDPR) which will take effect on 25 May 2018) as they introduce automated decision processes into their business practices.

A key principle of data protection legislation is that personal data should be processed fairly, lawfully and in a transparent manner. In determining whether processing is fair, one has to consider the effect of that processing on the individual and the individual’s expectations of how that data will be used.

Under the Data Protection Act 1998, individuals have the right to find out what decisions are made about them by automated means and, where those decisions significantly affect them, to prevent such decisions being made. This includes profiling, provided it is conducted solely by automated means. The rules on automated decision-taking are set to be supplemented under the GDPR when it comes into effect next year. One key change is that individuals must be informed, at the point their data is obtained, that profiling will take place and what its consequences are, rather than only if they ask.

How might individuals be affected by profiling?

How could a decision about displaying a particular advert to a particular individual be deemed to ‘significantly affect’ that individual? The ICO paper highlights the potential for machine learning algorithms to make discriminatory decisions. For example, it cites research in the USA suggesting that internet searches for ‘black-identifying’ names generated adverts associated with arrest records far more often than searches for ‘white-identifying’ names. Advertisers may need to consider ways to build discrimination detection into their machine learning systems and to maintain human oversight of the adverts being displayed.
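As a purely illustrative sketch of what such discrimination detection might involve, the snippet below computes the rate at which a sensitive ad category is served to different demographic groups and reports the largest gap. The function name, the logging schema and the category label are all hypothetical, not drawn from any real ad-serving system; a large gap would simply be a signal to escalate the campaign for human review.

```python
from collections import Counter

def parity_gap(served_records, sensitive_category):
    """Return the largest difference in serving rate for a sensitive ad
    category across demographic groups, plus the per-group rates.

    served_records: iterable of (group, ad_category) tuples, an assumed
    logging format rather than any real ad server's output.
    """
    shown = Counter()   # sensitive-category impressions per group
    totals = Counter()  # all impressions per group
    for group, category in served_records:
        totals[group] += 1
        if category == sensitive_category:
            shown[group] += 1
    rates = {g: shown[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Illustrative log: group A is shown the sensitive ad far more often than B.
log = ([("A", "arrest_record")] * 8 + [("A", "other")] * 2
       + [("B", "arrest_record")] * 1 + [("B", "other")] * 9)
gap, rates = parity_gap(log, "arrest_record")
# gap is 0.7 here (0.8 for A vs 0.1 for B), which a compliance threshold
# might treat as grounds for human review of the targeting model.
```

A check of this kind does not by itself establish unlawful discrimination, but routinely running one over served-ad logs is one way to keep the human oversight the ICO paper contemplates.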

Advertisers should also consider whether an individual would reasonably expect to have their data used for big data analytics purposes, including profiling. This is sometimes a grey area. An example from the ICO’s paper is whether people who post on social media have a reasonable expectation that their data could be used for market research purposes. In determining whether this is an unfair use of personal data, much will depend on the level of transparency about how data is used.

Finding the right balance

What is clear is that the ICO believes that exploiting big data for commercial purposes and complying with data protection laws are not necessarily mutually exclusive. However, the advertising industry should recognise that bringing its operations into compliance with the GDPR will require careful thought and some resources – it is not going to be possible to achieve compliance overnight.