Department for Digital, Culture, Media and Sport Select Committee publishes responses from Competition and Markets Authority and the Advertising Standards Authority to its report on influencer culture

The Committee’s report, published in May 2022, highlighted how the growth in the influencer market has exposed several regulatory gaps, particularly around advertising disclosure and protection for children, both as influencers and viewers. It called on the Government to strengthen employment law and advertising regulations.

The Committee recommended that the CAP Code require virtual influencers to be watermarked, both to flag that the influencer is virtual and to make clear the details of the account's owner.

In response, the ASA said that it had not received any complaints relating to virtual influencers and was not aware of any research suggesting that virtual influencers are causing advertising-related harm. Further, watermarking the accounts of virtual influencers for purposes other than advertising would fall outside the advertising remit of the ASA system.

In response to the concern that the anonymous controller of a virtual influencer account is less likely to be deterred from non-compliance by ASA name-and-shame sanctions or other self-regulatory and statutory sanctions, the ASA said that one advantage of the ASA system is that it can tackle non-compliance via the influencer, the brand being marketed and/or the platform. It is not, therefore, reliant on remedying non-compliance solely via the influencer or the controller of the virtual influencer account. If necessary, the ASA system can work with third parties, including platforms, to seek to identify contact details for the controller of the account.

As for the concern that a web user has the right to know the controller of the virtual influencer account, the ASA said that it understood that the Online Safety Bill may give Ofcom powers to consider the circumstances in which the use of virtual social media accounts (used by people, organisations etc., which conceal their identities) may lead to harm and, in such circumstances, the duty of care that may apply to the platform in scope of the Bill.

The Committee also recommended that the remit of the CAP Code be extended to remove the requirement for editorial “control” to determine whether content constitutes an ad. In response, the ASA said that its current “payment” and “control” tests have always ensured that it does not inappropriately extend its regulation to editorial or sponsorship matters. Further, the ASA gives these tests a broad interpretation, which means that, in practice, most influencer posts fall within the remit of the ASA. However, the ASA will explore how it can clarify further that payment alone invokes advertising disclosure requirements and whether there is a case for reconsidering the ASA system payment and control tests as they apply to influencer and native advertising.

The Committee also recommended that the ASA be given statutory powers to enforce the CAP Code and that these powers be considered as part of the Government’s Online Advertising Programme (OAP). In response, the ASA said that enhanced statutory and self-regulatory enforcement powers, combined with new and impactful use of data science to monitor influencer ads at pace and scale, will prove effective in improving compliance outcomes in this aspect of regulation. To date, the ASA has not needed to refer an influencer or a brand to a legal backstop for repeated non-compliance.

The Committee also recommended that the ASA update the CAP code to include mandatory enhanced disclosure standards for ads targeted to children. In response, the ASA said the CAP Code already includes mandatory enhanced disclosure standards for ads targeted to children because the underlying consumer protection legislation means that the regulator must consider any vulnerabilities in the audience of ads.

The Committee also recommended that, given the expansion of the market for child influencers and the considerable safeguarding concerns this has raised, a further review of the use of under-16s in marketing be carried out, focussing on the use and impact of child influencers. In response, the ASA said that the limits of its role in relation to the use of under-16s as brand ambassadors and in peer-to-peer marketing remain. Further, the safeguarding concerns do not fall within the remit of the ASA system, given that the codes do not cover the relationship (employment or otherwise) between the marketer and the child or, indeed, between the child and other parties, e.g. parents and guardians. Nonetheless, the ASA has committed to review, and if necessary update, its guidance by the end of 2022 to ensure it remains fit for purpose and relevant to marketing communications that result from a reciprocal relationship between a marketer and a child, especially in relation to influencer marketing.

As for the CMA’s response to the Committee’s recommendation that, given the risks involved in the use of machine learning technology, the CMA should report yearly to Government on the development of its monitoring technology, the CMA pointed out that it is a full member, and the ASA an associate member, of the Digital Regulation Cooperation Forum, which is driving regulatory best practice in the use of AI and machine learning. The CMA added that it is committed to sharing its work with Government on a regular basis. The CMA also updated the Committee on its work investigating the role that social media platforms play in influencer endorsements. Both responses are available on the Committee’s website.