Ofcom publishes analysis showing seven in ten Premier League footballers face Twitter abuse


Ofcom, which is preparing to regulate tech giants under new online safety laws, teamed up with The Alan Turing Institute to analyse more than 2.3 million tweets directed at Premier League footballers over the first five months of the 2021/22 season.

For the study, researchers developed a new machine-learning tool that can automatically assess whether tweets are abusive. A team of experts also manually reviewed a random sample of 3,000 tweets.

Ofcom found that:

  • most fans use social media responsibly; of the manually reviewed random sample of 3,000 tweets, 57% were positive towards players, 27% were neutral and 12.5% were critical; however, the remaining 3.5% were abusive; similarly, of the 2.3 million tweets analysed with the machine-learning tool, 2.6% contained abuse;
  • hundreds of abusive tweets are sent to footballers every day; while the proportion of abusive tweets might be low, this still amounts to nearly 60,000 abusive tweets directed towards Premier League players in just the first half of the season, an average of 362 every day, equivalent to one every four minutes; around one in twelve personal attacks (8.6%) targeted a victim’s protected characteristic, such as their race;
  • seven in every ten Premier League players are targeted; over the period, 68% of players (418 in total) received at least one abusive tweet, and one in fourteen (7%) received abuse every day; and
  • a handful of players face a barrage of abuse; the study recorded which footballers were targeted and found that half of all abuse towards Premier League players is directed at just twelve individuals, who each received an average of 15 abusive tweets every day.
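The headline figures above hang together arithmetically. A quick back-of-the-envelope check makes the relationships explicit; note that the ~165-day window used below is an assumption standing in for "the first five months of the season", not a figure Ofcom states.

```python
# Sanity check of the figures quoted above.
# The 165-day study window is an assumed value, not Ofcom's stated figure.

total_tweets = 2_300_000
abuse_rate = 0.026  # 2.6% of analysed tweets flagged as abusive

abusive_tweets = total_tweets * abuse_rate
print(f"abusive tweets: {abusive_tweets:,.0f}")   # ~59,800, i.e. "nearly 60,000"

days = 165  # assumed length of the five-month window
per_day = abusive_tweets / days
print(f"per day: {per_day:.0f}")                  # ~362

minutes_between = 24 * 60 / per_day
print(f"one every {minutes_between:.1f} minutes") # ~4 minutes
```

The numbers reproduce the article's claims: roughly 60,000 abusive tweets over the half-season works out to about 362 a day, or one every four minutes.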

In a separate poll, the study also asked the public about their experiences of seeing players targeted online. More than a quarter of teens and adults who go online (27%) saw online abuse directed at a footballer last season. This rose to more than a third of fans who follow football (37%) and was higher still among fans of the women's game (42%).

Among those who came across abuse, more than half (51%) said they found the content extremely offensive, but a significant proportion (30%) did not take any action in response. Only around one in four (26%) used flagging and reporting tools to report the abusive content to the platform or mark it as junk.

Ofcom says that tackling online abuse requires a team effort. Social media firms do not have to wait for new laws to make their sites and apps safer for users. Ofcom says that when it becomes the regulator for online safety, tech companies will be required to be transparent about the steps they are taking to protect users. The regulator will expect them to design their services with safety in mind.

Supporters can also play a positive role, Ofcom says. The research shows that most online fans behave responsibly, and as the new season kicks off Ofcom is asking them to report unacceptable, abusive posts whenever they see them.

The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms, as well as other services that people use to share content online. The Bill does not give Ofcom a role in handling complaints about individual pieces of content. The Government recognises, and Ofcom agrees, that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, Ofcom says that it will tackle the causes by ensuring companies design their services with safety in mind from the start. It will examine whether companies are doing enough to protect their users from illegal content, as well as content that is harmful to children. Ofcom's news release can be read in full on its website.