LONDON — U.K. media regulator Ofcom — currently led by Dame Melanie Dawes, an appointee of Conservative Prime Minister Boris Johnson — today issued “new guidance” for video-sharing platform providers regarding what the government considers “measures to protect users from harmful material.”
In the documents, Ofcom asserts that “pornography” is among the types of material with the most potential to harm under-18s. Even in defining its terms, however, Ofcom quickly blurs the line between legal and illegal by conflating child sexual abuse material and hate speech inciting violence with legal adult entertainment, under the blanket term “harmful material.”
Unlike U.S. legislation, U.K. government measures are not subject to anything like First Amendment or Section 230 scrutiny.
The official “Guidance” — which applies to U.K.-based platforms such as OnlyFans, PocketStars, TikTok, Snapchat, Vimeo and Twitch — states that Ofcom “has been given new powers to regulate U.K.-established video-sharing platforms (VSPs)” and that it “set out to protect users of VSPs from specific types of harmful material in videos.”
The goal, Ofcom declared, includes both “protecting under-18s from potentially harmful material” and protecting “all users” from “material inciting violence or hatred, and content constituting criminal offenses relating to terrorism; child sexual abuse material; and racism and xenophobia.”
Ofcom’s new guidance also “sets out a list of measures which providers must consider taking, as appropriate, to secure the required protections.”
A Vague, Discretionary Definition of ‘Harmful’ Content
Ofcom’s “Guidance” defines a vague category of “harmful material” that merges two unrelated types of content, aiming to protect:
a) the general public from “relevant harmful material”:
i) incitement to violence or hatred against particular groups
ii) content which would be considered a criminal offense under laws relating to terrorism; child sexual abuse material; and racism and xenophobia
b) under-18s from “restricted material”:
i) material which has, or would likely be given, an R18 certificate
ii) material which has been deemed, or would likely be deemed, unsuitable for classification (such as sadistic violence or torture)
iii) other material which might impair the physical, mental or moral development of under-18s
The items listed under category (b) could be interpreted as including all sexual expression, including educational material and works of art.
Explicitly blurring the distinction between categories (a) and (b), Ofcom states, “we refer to these two categories of material as ‘harmful material’ and discuss the two requirements collectively as the ‘requirement to protect users from harmful material.’”
Although the “Guidance” is described as “not a set of compulsory steps” but merely as “intended to help guide providers in deciding how best to comply with the statutory requirements,” the U.K. press is reporting that Ofcom has also indicated that “U.K. websites and apps that host pornography and adult material — such as OnlyFans and PocketStars — must put in place strict age-verification processes or face severe financial penalties,” including fines of £250,000 or 5% of applicable turnover, whichever is greater.
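The reported penalty formula reduces to taking the greater of two figures: a flat £250,000 or 5% of applicable turnover. As a minimal illustration of that arithmetic (the function name and types are ours, not Ofcom's, and "applicable turnover" is defined by the regulator, not here):

```python
def max_vsp_penalty(applicable_turnover: float) -> float:
    """Maximum fine under the reported VSP penalty rule:
    the greater of a flat 250,000 GBP or 5% of applicable turnover."""
    FLAT_CAP = 250_000.0
    TURNOVER_SHARE = 0.05
    return max(FLAT_CAP, TURNOVER_SHARE * applicable_turnover)

# For a platform with 10 million GBP in applicable turnover,
# 5% (500,000 GBP) exceeds the flat cap, so the percentage governs.
print(max_vsp_penalty(10_000_000))  # 500000.0
print(max_vsp_penalty(1_000_000))   # 250000.0 (flat cap governs)
```

In other words, the flat figure only binds for platforms whose applicable turnover is below £5 million; above that, the 5% share is always larger.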
Ofcom’s “Guidance” also states that the government regulator expects private companies to exercise self-censorship and to police content on the regulator’s behalf.
“While we acknowledge that harmful material may not be completely eradicated from a platform, we expect providers to make meaningful efforts to prevent users from encountering it,” the Ofcom document states.
In the U.S., any similar mandate would most likely be struck down on First Amendment grounds and would conflict with current Section 230 protections.
The Power to Control Online Speech
Ofcom also asserts its power to completely control online content and speech, and to threaten companies with different levels of punishment, at its own discretion.
“Where Ofcom has concerns, we will act in accordance with our Enforcement Guidelines,” the guidelines state. “Where appropriate, we will generally seek to work with providers informally to try to resolve those concerns. Where serious concerns arise, we have the power to take swift and robust enforcement action, which may include sanctions. Sanctions could include an enforcement notification requiring the VSP provider to take specified actions, and/or impose a financial penalty. Ultimately, we also have the power to suspend or restrict a service in cases involving the most serious non-compliance.”
‘Pornography’ Listed Alongside Violence
Section 3.13 of the “Guidance” includes “pornography” among “other material that might impair the physical, mental or moral development of under-18s”:
The legislation does not specify particular examples of material that might impair the physical, mental or moral development of under-18s.
In order to support a greater understanding of this, Ofcom commissioned a wide-ranging research study into the risks and harms to children and young people being online, using social media and VSPs. The report goes beyond the VSP framework but providers may find it helpful to consider the report’s findings.
In particular, the following potential harms could be relevant to consider when drafting terms and conditions or acceptable use policies and determining which other measures it may be appropriate to take, to protect under-18s from material that might impair the physical, mental or moral development:
b) Self-injurious content which may cause physical harms, such as material promoting eating disorders, self-harm and suicide;
c) Mental health and wellbeing factors which may lead to a harm, such as psychological distress, depression, anxiety, social withdrawal, body image and addictive-type behaviors;
d) Aggression, including hate speech, violent material, dangerous behavior, cyberbullying, online harassment, and cyberstalking;
e) Manipulation intended to harm, through image, AI and algorithmic manipulation; profiling and persuasive design including nudging and targeting leading to a detrimental impact on under-18s.
The document defines “pornography” as “videos whose primary purpose is sexual arousal or stimulation,” and cites the British Board of Film Classification (BBFC) ratings system for theatrical motion pictures.
Ofcom to Decide What’s ‘Primarily Pornographic’
The “Guidance” also defines a specific category of platform, the “platform [that] specializes in restricted material of a pornographic nature,” which is subject to a vaguely worded exception from the duties imposed on other, “non-specialized” sites, a carve-out that is spelled out in section 4.23:
4.23 We recognize that for some types of platform, the inclusion of terms and conditions requiring users to notify the provider if they upload restricted material may not be necessary. For example, if the platform specializes in restricted material of a pornographic nature. In such cases we expect providers to implement other measures to protect under-18s from videos containing restricted material, such as having appropriately robust age assurance systems in place.
Section 4.112 essentially empowers Ofcom to become the state censor to assess what is and isn’t “primarily pornographic”:
4.112 Should Ofcom be required to make an assessment about whether a platform requires such measures, some of the indicators that we might consider in making this assessment include:
• How much pornography is on the platform. This could relate to the absolute number of videos; the ratio of pornographic to non-pornographic content; or the relative number of sub-sections of the site dedicated to pornography.
• The significance of pornography to the service. Where the site allows for users to subscribe to the accounts of content creators, this might relate to how many of those accounts post pornography, or the number of subscribers pornographic accounts have.
• The way the service is positioned in the market. This could be how it brands its own offering, or how the platform is viewed by users.
• Third-party insights which indicate the service specializes in pornography, or that there is a high risk of under-18s being able to access pornographic material on the platform.
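None of these indicators is quantified in the “Guidance.” Purely to illustrate how the first two bullet points could be turned into a self-assessment, here is a hypothetical sketch; every name and the 0.5 threshold are invented for illustration, as Ofcom sets no numeric cut-off:

```python
def specializes_in_pornography(porn_videos: int, total_videos: int,
                               porn_creators: int, total_creators: int,
                               threshold: float = 0.5) -> bool:
    """Hypothetical self-assessment against Ofcom's first two indicators:
    the share of pornographic videos on the platform, and the share of
    creator accounts that post pornography. The threshold is invented;
    the Guidance gives no numeric cut-off."""
    video_share = porn_videos / total_videos if total_videos else 0.0
    creator_share = porn_creators / total_creators if total_creators else 0.0
    return video_share >= threshold or creator_share >= threshold

# A platform where 90% of videos are pornographic would plainly qualify;
# one where 1% are, and few creators post pornography, plainly would not.
print(specializes_in_pornography(900, 1000, 10, 100))  # True
print(specializes_in_pornography(10, 1000, 5, 100))    # False
```

The discretionary nature of the real assessment is exactly the point: Ofcom, not any formula, decides where the line sits.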
Melanie Dawes ‘Gearing Up for’ More Censorship
Ofcom CEO Melanie Dawes — appointed by Boris Johnson after a long civil service career under David Cameron and Theresa May — said that “the likes of TikTok and Snapchat could not address the new rules by setting up youth-specific platforms and had to focus on their main services,” The Guardian reported.
“It is not, in our view, any good to introduce a youth site like a ‘young Instagram,’” she stated. “You have got to address the issues on the main site.”
Dawes added that “online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.”
Pointing out that her mission of online content censorship has only just started, Dawes also noted that “the platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”
YouTube and Facebook are not covered by these regulations because they are based in Ireland, which — unlike post-Brexit U.K. — is still part of the EU.
Main Image: Boris Johnson-appointed Ofcom CEO Dame Melanie Dawes