The U.K.’s internet safety regulator, Ofcom, has released new draft guidance to help companies meet their legal obligations to protect women and girls from online threats, including harassment, bullying, misogyny, and intimate image abuse, as part of the implementation of the Online Safety Act (OSA).

The government has made it clear that protecting women and girls is a top priority in the implementation of the OSA, with certain forms of abuse, such as sharing intimate images without consent or using AI to create deepfake porn, being explicitly identified as enforcement priorities.

The online safety regulation, which was approved by the U.K. parliament in September 2023, has faced criticism for not being strong enough to reform platform giants, despite containing significant penalties for non-compliance, including fines of up to 10% of global annual turnover.

Child safety campaigners have also expressed frustration with the slow pace of implementation and doubts about the law’s effectiveness.

In an interview with the BBC in January, Technology Minister Peter Kyle described the legislation as “very uneven” and “unsatisfactory,” but the government is pushing forward with the approach, which requires parliament to approve Ofcom’s compliance guidance.

Enforcement is expected to begin soon for core requirements related to tackling illegal content and child protection, while other aspects of OSA compliance will take longer to implement, with Ofcom conceding that the latest guidance won’t become fully enforceable until 2027 or later.

Approaching the Enforcement Start Line

“The first duties of the Online Safety Act are coming into force next month,” Jessica Smith, who led development of the guidance on women’s and girls’ safety, told TechCrunch in an interview, adding that Ofcom will start enforcing some of the core duties before this guidance itself becomes enforceable.

The new draft guidance on keeping women and girls safe online is intended to supplement earlier Ofcom guidance on illegal content, which provides recommendations for protecting minors from seeing adult content online.

In December, the regulator published its finalized guidance on how platforms and services should reduce risks related to illegal content, with a focus on child protection.

Ofcom has also produced a Children’s Safety Code, which recommends online services implement age checks and content filtering to prevent kids from accessing inappropriate content, and developed recommendations for age assurance technologies for adult content websites.

The latest guidance was developed with input from victims, survivors, women’s advocacy groups, and safety experts, and covers four areas where women and girls are disproportionately affected by online harm: online misogyny, pile-ons and online harassment, online domestic abuse, and intimate image abuse.

Safety by Design

Ofcom’s top recommendation is for in-scope services and platforms to take a “safety by design” approach, with Smith urging tech firms to “take a step back” and think about their user experience in a holistic way to prioritize the safety of women and girls.

Smith highlighted the rise of image-generating AI services, which have led to a significant increase in deepfake intimate image abuse, as an example of where technologists could have taken proactive measures to mitigate risks but did not.

Ofcom highlights examples of “good” industry practices, including removing geolocation by default, conducting “abusability” testing, taking steps to boost account security, designing user prompts that make posters think twice before posting abusive content, and offering accessible reporting tools.

Not every measure will be relevant for every type or size of service, as the law applies to a wide range of online services, and companies will need to understand what compliance means in the context of their product.

Transparency

Smith suggested that Ofcom’s response to industry shifts that may increase online harms will focus on using transparency and information-gathering powers to illustrate impacts and drive user awareness, with a “name and shame” approach to highlight companies that are not meeting standards.

Ofcom will produce a market report on who is using the guidance, who is following what steps, and what kind of outcomes they’re achieving for users who are women and girls, to shine a light on what protections are in place on different platforms.

Smith emphasized that companies wanting to avoid reputational harm can turn to Ofcom’s guidance for practical steps to improve the situation for their users.

Tech to Tackle Deepfake Porn

Ofcom is beefing up its recommendations on intimate image abuse, suggesting the use of hash matching to detect and remove such abusive imagery, and plans to update its earlier codes to incorporate this change in the near future.

The regulator has seen a substantial increase in deepfake intimate image abuse, particularly in relation to AI-generated content, and has gathered evidence on the effectiveness of hash matching to tackle this harm.
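At its core, hash matching works by comparing a fingerprint of uploaded media against a database of fingerprints of known abusive images, so that confirmed content can be blocked at upload rather than after a report. Below is a minimal sketch in Python using exact cryptographic hashes; it is an illustration, not Ofcom's specification. The blocklist and function names are hypothetical, and production systems typically use perceptual hashes (such as PhotoDNA or PDQ) so that re-encoded or slightly altered copies of an image still match.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known abusive images,
# as might be supplied by an industry hash-sharing scheme.
# (The entry below is just the digest of the bytes b"test", for illustration.)
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Block an upload if its fingerprint matches a known abusive image."""
    return fingerprint(image_bytes) in KNOWN_ABUSE_HASHES
```

The limitation of exact hashing, and the reason real deployments favor perceptual hashing, is that changing a single byte of the file produces a completely different digest, so a cropped or re-compressed copy would slip through an exact-match check.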

The draft guidance will undergo consultation until May 23, 2025, after which Ofcom will produce final guidance by the end of the year, with the first report reviewing industry practice in this area to be produced 18 months later.

Smith acknowledged criticism that the OSA is taking too long to implement but emphasized the importance of consulting on compliance measures, and anticipated a shift in the conversation surrounding the issue once the final measures take effect.
