2024 Opinion on Age Assurance versus the 2021 Opinion – what’s changed?

On 18 January 2024, the Information Commissioner published his updated Opinion on Age Assurance, which replaces the Opinion from October 2021. Although in many ways the two Opinions are very similar, here are five interesting changes we spotted in the 2024 Opinion, with some practical thoughts for games companies:

The 2024 Opinion considers self-declaration age assurance in more detail than its predecessor did. The headline is that it can still be a suitable measure for lower-risk services and that it benefits from being minimally intrusive.

Looking beyond this, because self-declaration can be easily bypassed, the ICO says that services could look out for ‘red flags’ which contradict a person’s declared age or age range. The service provider could then ask the user to confirm their age using an alternative age assurance method.

For example, if a games platform shares age data with a games publisher, the publisher could compare that dataset against any age data it collects itself and look for discrepancies. If there are any, the games companies could consider whether a fresh age assurance check is needed and/or whether any unlawful processing has occurred.
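By way of illustration only, the short Python sketch below shows the kind of reconciliation check described above: comparing a platform-supplied age against the publisher’s own record and flagging discrepancies for review. The record structure, field names and tolerance are our own assumptions for the example, not anything prescribed by the Opinion.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeRecord:
    user_id: str
    declared_age: Optional[int]  # age the user declared to this company, if retained

def find_age_discrepancies(platform_records: list[AgeRecord],
                           publisher_records: list[AgeRecord],
                           tolerance: int = 1) -> list[str]:
    """Return user IDs whose platform-declared and publisher-declared ages
    differ by more than `tolerance` years (an illustrative threshold)."""
    publisher_by_id = {record.user_id: record for record in publisher_records}
    flagged = []
    for platform_record in platform_records:
        publisher_record = publisher_by_id.get(platform_record.user_id)
        if publisher_record is None:
            continue  # user not present in both datasets, nothing to compare
        if platform_record.declared_age is None or publisher_record.declared_age is None:
            continue  # age not retained on one side; handled outside this sketch
        if abs(platform_record.declared_age - publisher_record.declared_age) > tolerance:
            flagged.append(platform_record.user_id)  # candidate for a follow-up check
    return flagged
```

Users flagged in this way could then be asked to confirm their age via an alternative age assurance method, with any resulting unlawful processing considered separately.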

This topic is also discussed beyond self-declaration, in the context of age estimation (the ‘waterfall’ method).

At 13, UK users are able to provide their own consent under Article 8 of the UK GDPR, and at 18 they are recognised as adults. On this basis, the Commissioner says that online services should consider whether further checks are required as users reach these ages, so that they can access the parts of the service which are appropriate for them.

In practice, we expect re-performing such a check is going to be tricky for many games that rely on a self-declaration age gate where the user selects their age rather than entering their full date of birth. Equally, some games companies do not retain any age assurance data after the age check has been undertaken. In these cases, games companies may want to consider whether ‘periodic’ age checks for users are a proportionate way of dealing with this issue, taking into account any wider risks in the game.
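As a rough sketch of what ‘periodic’ checks might involve in practice, the Python below estimates when a self-declared age could cross the 13 or 18 thresholds mentioned above and so prompt a further check. The function name, the 365-day approximation and the ‘latest plausible crossing’ assumption are ours, purely for illustration.

```python
from datetime import date, timedelta
from typing import Optional

# Ages referenced in the Opinion: 13 (UK GDPR Article 8 consent) and 18 (adulthood).
AGE_THRESHOLDS = (13, 18)

def next_recheck_due(declared_age: int, checked_on: date) -> Optional[date]:
    """Rough estimate of when a user who self-declared `declared_age` on
    `checked_on` could cross the next relevant threshold, assuming they had
    only just turned that age at the time (i.e. the latest plausible crossing
    date; a more cautious design would re-check earlier). Returns None if the
    user was already 18 or over. Purely an illustrative heuristic."""
    for threshold in AGE_THRESHOLDS:
        if declared_age < threshold:
            years_until_threshold = threshold - declared_age
            # 365 days per year is an approximation; leap days are ignored.
            return checked_on + timedelta(days=365 * years_until_threshold)
    return None

# Example: a user who declared they were 12 on 1 March 2023 could reach 13
# around March 2024, at which point a further check might be considered.
print(next_recheck_due(12, date(2023, 3, 1)))
```

Whether a trigger like this is proportionate will depend on the game, the age assurance data actually retained and the wider risk profile of the service.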

The 2021 Opinion did not explore specific legal bases for the deployment of age assurance measures. The 2024 Opinion, however, states that ‘legitimate interests’ or ‘legal obligation’ are likely to be the most relevant grounds. For a ‘legitimate interests’ basis, it is key that services complete the ‘three-part test’ (purpose, necessity and balancing against the rights and freedoms of users).

The Commissioner sets out that, for ‘legal obligation’, age assurance may be required by “online safety legislation or gambling licence conditions” – the Opinion signposts that more robust age assurance requirements are expected under the Online Safety Act.

The updated Opinion also touches upon children’s GDPR rights – namely, that these rights belong to the child themselves (even if they are too young to understand them). Services should therefore only allow a parent or guardian to exercise these rights on a child’s behalf if:

  • the child authorises them to do so;
  • the child does not have sufficient understanding to exercise the rights themselves; or
  • it is evident that this is in the best interests of the child.

The Commissioner also notes that in Scotland, there is a presumption that a child of 12 or over has sufficient understanding to be able to exercise their rights. There is no equivalent presumption anywhere else in the UK.

Clear and accessible information about a child’s GDPR rights is therefore recommended (see LEGO’s kid-friendly privacy policy and video as an example), along with self-serve tools that are simple to access and use.

The 2021 Opinion covers age assurance that uses AI and the risks around algorithmic bias and statistical accuracy. However, the 2024 Opinion adds a new section on ‘algorithmic fairness’.

In short, the Commissioner heavily emphasises the importance of the training data used for AI models and of ensuring that the data is diverse, high-quality and relevant. The performance of the system should be kept under review, particularly how it compares across different groups. This is especially important if results generated by the AI system are fed back into training the same system.
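To illustrate the kind of cross-group review the Commissioner has in mind, the sketch below compares an age-estimation model’s mean absolute error across labelled groups in an evaluation set and flags any group whose error sits well above the overall level. The data structure, group labels and 1.5x review threshold are illustrative assumptions on our part.

```python
from collections import defaultdict
from statistics import mean

def mean_absolute_error_by_group(predictions: list[dict]) -> dict[str, float]:
    """Compute mean absolute error of predicted ages per group label.

    `predictions` is an illustrative structure: each entry has a ground-truth
    'actual_age', the model's 'predicted_age', and a 'group' label used for
    the fairness comparison (e.g. an age band or other demographic category).
    """
    errors_by_group: dict[str, list[float]] = defaultdict(list)
    for record in predictions:
        error = abs(record["predicted_age"] - record["actual_age"])
        errors_by_group[record["group"]].append(error)
    return {group: mean(errors) for group, errors in errors_by_group.items()}

# Illustrative evaluation run: flag groups whose error is well above the overall level.
evaluation = [
    {"actual_age": 14, "predicted_age": 16, "group": "group_a"},
    {"actual_age": 17, "predicted_age": 17, "group": "group_a"},
    {"actual_age": 13, "predicted_age": 19, "group": "group_b"},
    {"actual_age": 15, "predicted_age": 20, "group": "group_b"},
]
per_group = mean_absolute_error_by_group(evaluation)
overall = mean(abs(r["predicted_age"] - r["actual_age"]) for r in evaluation)
for group, group_error in per_group.items():
    if group_error > 1.5 * overall:  # illustrative review threshold
        print(f"Review needed: {group} error {group_error:.1f} vs overall {overall:.1f}")
```

In practice, the grouping and metrics would need to reflect the particular model and the characteristics that matter for the service, and any review loop should account for model outputs being fed back into training.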

Any games company undertaking AI modelling (for age assurance or otherwise – e.g. to moderate content) should consider this – see the ICO’s guidance on AI and data protection for more information.