On 1 November 2020, the United Kingdom’s implementation of the European Union’s Audiovisual Media Services Directive (AVMSD) came into force through The Audiovisual Media Services Regulations 2020. It is likely to be the last significant EU directive implemented in the United Kingdom.
The Regulations will apply to any online service (including a social media platform) within UK regulatory jurisdiction that permits the uploading of video content.
Platforms with UK operational entities, such as Snapchat, TikTok and Twitch, will fall within its scope. However, owing to their Dublin-based European headquarters, platforms such as Facebook and YouTube will instead fall under the equivalent regime in Ireland.
Platforms will be required to register with, and be regulated by, Ofcom, and will have to pay an annual fee to the regulator.
The aim of the Directive and the implementing Regulations is to tackle harmful content in online video. Article 28b of the AVMSD sets out these aims in further detail:
Without prejudice to Articles 12 to 15 of Directive 2000/31/EC, Member States shall ensure that video-sharing platform providers under their jurisdiction take appropriate measures to protect:
(a) minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1);
(b) the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter;
(c) the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence as set out in Article 5 of Directive (EU) 2017/541, offences concerning child pornography as set out in Article 5(4) of Directive 2011/93/EU of the European Parliament and of the Council (*1) and offences concerning racism and xenophobia as set out in Article 1 of Framework Decision 2008/913/JHA
In short, platforms will be required to take steps to ensure that harmful content is not uploaded to or disseminated on their services. This legislative response aims to update the existing law to reflect the challenges of the modern Internet. Given the supra-national nature of the Internet, it will be interesting to see whether the Directive takes on an extraterritorial element, with platforms opting for a global approach rather than introducing controls only for specific EU Member States.
The term “appropriate measures” introduces an element of vagueness into the obligations of online video-sharing platforms (VSPs). This may create issues in relation to any control mechanisms that platforms seek to implement; Ofcom has yet to publish guidance on what constitutes “appropriate measures”.