
The UK’s Online Safety Act applies to Small Tech too


Analysis A little more than two months out from its first legal deadline, the UK’s Online Safety Act is causing concern among smaller online forums caught within its reach. The legislation, which came into law in the autumn of 2023, applies to search services and services that allow users to post content online or to interact with each other.

The government says it is designed to protect children and adults online and creates new legal duties for the online platforms and apps affected.

When one thinks of online harms — death threats, revenge porn, suicide encouragement etc — one thinks of the largest global platforms and services. But over the decades small hobbyist forums have sprung up on even the most niche topics, while retailers often allow customers to chat to each other to share ideas, challenges, and solutions.

Estimates suggest 100,000 such services will have to comply with the act, and some feel they don’t have resources to do the compliance work, or that their content is not relevant to the kind of harm the law is designed to prevent.

Nonetheless, according to law, regulator Ofcom could impose a fine of £18 million ($22 million) or 10 percent of their global revenue — whichever is greater — for failing to comply.

For the cycling forum dubbed London Fixed Gear and Single Speed (that’s LFGSS for short), the Online Safety Act was too much to bear.

“We’re done… we fall firmly into scope, and I have no way to dodge it. The act is too broad, and it doesn’t matter that there’s never been an instance of any of the proclaimed things that this act protects adults, children and vulnerable people from… the very broad language and the fact that I’m based in the UK means we’re covered,” the forum’s creator said in a post.

The Act applies to search services and to services that let users post content or interact with each other: websites, apps, and other services including social media, consumer cloud file storage and sharing sites, video-sharing platforms, online forums, dating services, and instant messaging services. It also applies to services with links to the UK, even if they are based outside it. If a service, platform, or forum has a significant number of UK users, if the UK is a target market, or if it is capable of being accessed by UK users and there is a material risk of significant harm to such users, then the law applies.

Any entity to which the law applies will have to comply with the first tranche of obligations, from 17 March 2025 onwards.

Ben Packer, partner at law firm Linklaters, said the first duty involves a risk assessment, which Ofcom says must be completed by 16 March 2025. “It contains 17 different types of priority illegal content that organizations are going to have to run through and work out whether the risk of harm for each type of content is negligible, low, medium or high. It’s quite a detailed and involved process, and, as well as doing that, once they’ve done that risk assessment, they’re going to have to work out what measures to put in place to mitigate those risks.

“Every single service will have to go through all 17 categories of priority illegal content and consider ‘non priority’ illegal content too. This is the case even if you’re a small cycling based forum that only allows text based comments. That’s just the way that Parliament designed the act. Unfortunately, that’s not Ofcom’s choice to do it that way,” Packer said.
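The grading exercise Packer describes amounts to a checklist: each priority category gets one of four ratings, and anything above negligible needs a mitigation plan. A minimal illustrative sketch, assuming a simple dict-based structure (the four-level scale and the 17 priority categories come from Ofcom's guidance as reported; the data layout itself is hypothetical):

```python
# Illustrative sketch of the risk-assessment exercise described above.
# The four risk levels are from Ofcom's guidance as reported; the
# structure and sample ratings here are invented for illustration.
RISK_LEVELS = ["negligible", "low", "medium", "high"]

# A few of the priority-offence categories named later in this article.
assessment = {
    "terrorism": "negligible",
    "hate offences": "low",
    "fraud and financial offences": "negligible",
    # ... a real assessment covers all 17 priority categories,
    # plus "non-priority" illegal content.
}

def needs_mitigation(ratings):
    """Return the categories whose risk is rated above negligible."""
    return [cat for cat, level in ratings.items() if level != "negligible"]

print(needs_mitigation(assessment))  # -> ['hate offences']
```

Even a text-only forum has to record a rating for every category; the point of the exercise is that only the non-negligible ones carry mitigation work forward.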

Who has to publish a summary risk assessment on their website? Sites with 34 million or more monthly active UK users. The threshold is lower for services that allow users to forward or reshare user-generated content and use a content recommender system: 7 million or more UK users. Organizations with a smaller number of users will simply have to have the document ready should Ofcom request it.
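The publication duty reduces to two conditions. A hypothetical sketch of the rule exactly as stated above (the thresholds are from the article; the function and its parameter names are illustrative only):

```python
def must_publish_summary(monthly_uk_users, allows_resharing, has_recommender):
    """Decide whether a service must publish a summary risk assessment
    on its website, per the thresholds described above: 34 million
    monthly active UK users, or 7 million if the service allows
    resharing of user content AND uses a content recommender system.
    Smaller services need only have the document ready if Ofcom asks.
    """
    if monthly_uk_users >= 34_000_000:
        return True
    if monthly_uk_users >= 7_000_000 and allows_resharing and has_recommender:
        return True
    return False

# A small hobbyist forum falls well under both thresholds:
print(must_publish_summary(29_000, False, False))  # -> False
```

Note that both the resharing functionality and the recommender system are required to trigger the lower threshold; a 10-million-user service with only one of the two stays at the 34 million bar.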

Ria Moody, Linklaters managing associate, said: “The practical journey [for smaller organizations] would be that Ofcom would write to these services and say, ‘Have you done your risk assessment? Please show it to us. We’re exercising our rights to request information from you as a service in scope.’ And then if the service hasn’t done a risk assessment, or the risk assessment wasn’t suitable and sufficient, which is the language that the Online Safety Act uses, based on Ofcom guidance and the principles they’ve set out, then Ofcom could take enforcement action.”

Once a service has completed a risk assessment, then Ofcom requires it to put in place the risk mitigations recommended in its Codes of Practice. Packer said: “Ofcom is saying they expect most services to have put in place almost all of the recommended risk mitigations by six months [after the March deadline], so September 2025. If after that, you still haven’t taken the measures to comply, that’s when they might start taking enforcement action. But what they’ve also said is that, in the meantime, they will still take action against deliberate or egregious breaches where there is a very significant risk of serious harm to UK users, especially children.”

The Codes of Practice measures are also categorized by size of service on the same criteria. Some measures will depend on functionality, for example, the ability to offer users controls to disable comments will only apply on a service which allows commenting on content. Meanwhile, some measures only apply if the relevant risks to which they relate are identified on the platform.

Moody said: “There are about 40 code measures which are the practical mitigation measures you need to put in place. For a small service [with less than 7 million monthly active UK users] which is low risk, of those 40-odd measures, about 14 apply.”
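The way measures fall away for smaller, lower-risk services can be pictured as a filter over the full list. A hedged sketch: the roughly 40/14 split is Moody's figure, but the sample measures and the fields below are invented for illustration.

```python
# Hypothetical representation of Codes of Practice measures. The real
# codes run to about 40 measures, of which roughly 14 apply to a small,
# low-risk service; these three examples are invented for illustration.
measures = [
    {"name": "have a content moderation function", "large_only": False, "needs": None},
    {"name": "let users disable comments",         "large_only": False, "needs": "commenting"},
    {"name": "run a dedicated reporting channel",  "large_only": True,  "needs": None},
]

def applicable(measures, is_large, functionality):
    """Keep only the measures that apply to a service of this size
    and with this set of features (e.g. {'commenting'})."""
    return [
        m["name"]
        for m in measures
        if (is_large or not m["large_only"])
        and (m["needs"] is None or m["needs"] in functionality)
    ]

# A small, text-only forum that does not allow commenting on content:
print(applicable(measures, is_large=False, functionality=set()))
# -> ['have a content moderation function']
```

The same filtering applies along the third axis the article mentions: some measures only bite if the corresponding risk was identified in the service's own risk assessment.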

She added: “On a practical level, if you were a very low-risk forum, which is maybe only text-based, the actual amount of time it’s going to take you to assess risk on that site will likely be a lot lower than if you are a complex social media platform with all different kinds of content, all different kinds of functionality, and a really wide user base.”

Nonetheless some forums in the UK remain concerned about the Act and the work it will require to comply, given they have never seen the kind of nefarious activity the Act is designed to prevent taking place in their community.

The Register spoke to an independently hosted outdoor sports forum with about 29,000 registered users and at most 2,000 active at any one time.

The lead moderator and technical manager, who asked not to be named, told us: “I don’t envy Ofcom because there must be hundreds of forums like ours. Technically, we could be used for these horrible things, but in 18 years, we have not. We get people ‘shouting’ at each other from time to time, which the moderators jump on, and people can post complaints. From a logical point of view, we’re very low risk.”


He said he would try to complete a risk assessment as best he can in the time allowed, conclude that the forum is low risk, and wait for feedback from Ofcom.

Ofcom has produced a regulation checker to help organizations understand if the Act applies to them.

A spokesperson said: “We know there’s a diverse range of services in scope of the UK’s new online safety laws. The actions sites and apps will need to take will depend on their size and level of risk. We’re providing support to online service providers of all sizes to make it easier for them to understand – and comply with – their responsibilities.”

As well as the Regulation Checker, Ofcom said it has an extensive program of work to make the regulations accessible, and compliance more easily attainable for all online services that fall in scope of the Act, which include many small or medium-sized enterprises.

In the next few weeks, Ofcom is set to launch a “Digital Support Service” consisting of interactive digital tools — accessible on the Ofcom website — for regulated organizations based on their perspectives and feedback. “Our first release will provide a four-step process for illegal harms, covering services’ risk assessment duties, Codes, and recordkeeping obligations,” it said.

Information published by Ofcom to help organizations and their lawyers achieve compliance with the new Act runs to more than 2,000 pages. Hopefully, with the tools promised in the coming weeks, smaller forums will be able to comply without wading through all the legal detail.

What activity is the Online Safety Act trying to prevent?

Ofcom has tried to group “priority offences” into broad categories such as terrorism, hate offences, child sexual exploitation and abuse, and fraud and financial offences. They include, for example:

  • offences related to information likely to be of use to a terrorist and offences relating to training for terrorism
  • hate offences such as stirring up of racial hatred offence and stirring up of hatred on the basis of religion or sexual orientation
  • sexual exploitation of adults such as causing or inciting prostitution for gain offence
  • human trafficking
  • assisting or encouraging suicide offence
  • the unlawful supply, or offer to supply, of controlled drugs, and of articles for administering or preparing controlled drugs
  • weapons offences, such as those relating to firearms, their parts, and ammunition, including air guns and shotguns

A full list of offences is available from Ofcom [PDF].
