As avid readers of our articles will know, video-sharing platforms (VSPs) are regulated under the Communications Act 2003, as amended by the Audiovisual Media Services Regulations 2020.
The rules affect video-sharing services, social media sites that allow users to share audiovisual content, and live-streaming audiovisual services, such as pornography websites.
The rules require VSPs to put measures in place to protect minors from harmful content, and to protect the general public from incitement to violence or hatred and from criminal content such as terrorism, child pornography, racism and xenophobia.
These measures include provisions in user terms and conditions, functionality to allow users to report problems, age-verification systems, content rating systems, parental controls and media literacy measures.
Ofcom is the statutory regulator, and it has now set out its key findings from the first year of regulation. When Ofcom started regulating the sector, it set out four broad aims: to raise standards, address non-compliance, increase transparency, and get industry and Ofcom ready for the future online safety regime.
Ofcom has statutory powers to collect information from notified VSPs. Based on the information it has collected, it has concluded that:
- All the regulated VSPs have safety measures in place, including rules on what kinds of video material are allowed. Some platforms made changes to their safety measures as a consequence of being regulated under the VSP regime.
- Platforms generally provided limited evidence of how well their safety measures are operating to protect users, making it difficult to determine with any certainty whether those measures are working consistently and effectively.
- More robust measures are needed to prevent children accessing pornography. According to Ofcom, some adult VSPs’ access control measures are not sufficiently robust to stop children accessing pornography. Ofcom has said that it expects them to set out clear plans for implementing more robust solutions and to make these changes this year where possible.
- Some platforms could be better equipped for regulation; some are not sufficiently prepared and resourced. Ofcom will be looking for platforms to improve and to provide more comprehensive responses to its information requests.
- Platforms are not prioritising risk assessment processes, which Ofcom believes are fundamental to proactively identifying and mitigating risks to user safety. Risk assessments will be a requirement on all regulated services under the online safety regime.
Ofcom has also set out its strategy and areas of focus for the next year. These will involve taking a broader look at the way platforms set, enforce, and test their approach to user safety, with a particular focus on robust age assurance. In summary, Ofcom will seek to:
- Ensure VSPs have sufficient processes in place for setting and revising comprehensive terms and conditions (generally known as Community Guidelines) that cover all relevant harms;
- Check that VSPs apply and enforce their Community Guidelines consistently and effectively to ensure harmful content is tackled in practice;
- Review the tools VSPs provide to allow users to control their experience, and promote greater engagement with these measures; and
- Drive forward the implementation of robust age assurance to protect children from the most harmful online content (including pornography). The ICO has issued guidance on age verification measures as part of its Children’s Code, which might help platforms comply with Ofcom’s expectations.
There are many references to the online safety regime in Ofcom’s report, and it appears that Ofcom is “practising” for its role in regulating online safety in relation to video sharing. However, the Online Safety Bill was under review by the Truss government and, with Rishi Sunak now taking over as Prime Minister, it remains to be seen what will happen to the Bill and the future regime.