Digital technology and children – what’s your approach?

Regulators are becoming increasingly concerned about children’s use of digital technology. Many online products and services cannot distinguish between users of different ages, and some have been developed specifically to appeal to children.

There are, of course, products and services that are illegal for children to use. The lines are more blurred, however, for products and services that may not be harmful to children – parts of the digital environment can be enjoyable, educational and can even help children develop skills. Further, digital technology has become an integral part of everyone’s lives (including children’s), and attempting to carve children out is simply not realistic.

For many products and services, access by children is not the issue that requires attention. Often, it’s particular features within products and services that merit focus. By way of example, games publishers have been implementing playtime limits and redesigning features that involve behavioural profiling and in-game interactions with other players so that children are not affected by them.

The move towards adjusting these features reflects industry trends. The proliferation of playtime limits follows widespread concern about the time children spend playing games. This is not a simple issue: design practices that encourage longer playtimes are often overlooked, yet adjusting those practices might, in many scenarios, arguably be more effective than imposing specific playtime limits.

As children are also seen as a more vulnerable group of players, there is an expectation that they will not be subjected to profiling practices connected to targeted advertising.

Multiplayer games allowing interaction with other players are currently amongst the most popular games. Creating a platform for children to interact with strangers via these games can put players at risk of exposure to inappropriate interactions and content. Most games services will therefore add additional protections for younger children, such as limiting connections to ‘friends’ and requiring prior parental consent.

On the measures that should be taken to protect children, the messaging from the ICO appears to be that one size does not fit all. A ‘risk-based approach’ is recommended, which in practice means weighing the age group of users against the aspects of the product or service that might be potentially harmful, and deploying appropriate safeguards (otherwise known as ‘age-appropriate application’).
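For readers implementing this, the ‘age-appropriate application’ idea can be sketched as a simple mapping from a user’s age band and a feature’s risk profile to the safeguards applied. The age bands, feature names and restrictions below are purely illustrative assumptions for the sketch, not ICO-prescribed rules.

```python
# Illustrative sketch only: age bands, feature names and the safeguard
# matrix are hypothetical, not taken from ICO guidance.

AGE_BANDS = [(0, 5, "0-5"), (6, 9, "6-9"), (10, 12, "10-12"),
             (13, 15, "13-15"), (16, 17, "16-17")]

def age_band(age: int) -> str:
    """Return the label of the band an age falls into; 18+ is 'adult'."""
    for low, high, label in AGE_BANDS:
        if low <= age <= high:
            return label
    return "adult"

# Hypothetical safeguard matrix: feature -> age bands in which it is restricted.
SAFEGUARDS = {
    "behavioural_profiling": {"0-5", "6-9", "10-12", "13-15", "16-17"},  # off for all children
    "open_chat": {"0-5", "6-9", "10-12"},        # friends-only below 13
    "unlimited_playtime": {"0-5", "6-9"},        # playtime limits for the youngest users
}

def restricted_features(age: int) -> set:
    """Features to restrict for a user of the given age."""
    band = age_band(age)
    return {feature for feature, bands in SAFEGUARDS.items() if band in bands}
```

The point of structuring it this way is that the same product ships to everyone, while the safeguards applied vary with the assessed age band – the essence of a risk-based approach.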

Integral to this approach is, of course, understanding the age group of users. The ICO uses the term ‘age assurance’ to refer collectively to approaches ‘used to provide assurance that children are unable to access adult, harmful or otherwise inappropriate content … and estimate or establish the age of users’ so that digital products and services can be adjusted to meet the level of protection appropriate to the age of users.

It’s important to note that age assurance tools are often criticised for requiring the collection of data about children beyond what’s necessary to provide the product or service, thereby contradicting the data minimisation principle and perhaps increasing risks associated with the collection of larger amounts of data.

At the same time, a balance needs to be struck between data-collection concerns and implementing effective age assurance tools. The simpler age assurance techniques that require the least data (such as self-declaration without any supporting evidence) are clearly easy to circumvent, for example by providing incorrect information. For this reason, further steps to supplement these measures (such as age verification through copies of documents) could be implemented to help allocate users to appropriate age groups with a greater degree of certainty.
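The layered approach described above – start with the least data-intensive check and escalate only where the risk warrants it – can be sketched as a small decision function. The function name, the escalation levels and the age threshold are all illustrative assumptions, not a prescribed scheme.

```python
# Hypothetical sketch of a layered age-assurance decision: collect the least
# data possible (self-declaration) and escalate to stronger evidence only for
# higher-risk features. All names and thresholds are illustrative assumptions.

HIGH_RISK_AGE_THRESHOLD = 18  # e.g. features restricted to adults

def assurance_level_required(declared_age: int, feature_min_age: int) -> str:
    """Decide how much age evidence to request, with data minimisation in mind."""
    if declared_age < feature_min_age:
        return "deny"             # declared age alone already blocks access
    if feature_min_age >= HIGH_RISK_AGE_THRESHOLD:
        return "document_check"   # high-risk feature: supplement self-declaration
    return "self_declaration"     # low-risk feature: collect no extra data
```

A design like this tries to honour the data minimisation principle: stronger (and more data-hungry) verification is only triggered where the declared age would unlock something the provider assesses as high-risk.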

The regulatory approach will continue to evolve in line with public concerns and developments across industries. The recommended risk-based approach means that the safeguards implemented by different providers may (and indeed should) vary according to the nature of the product or service.

We’re following developments in this exciting area closely and look forward to seeing the results of new guidance, technological developments and industry shifts.