Privacy, Danger And Widespread Discrimination: Should The UK Government Consider Action Against TikTok?


TikTok is an incredibly popular, albeit controversial, platform in the UK. This article outlines the platform's most serious failings, namely its recurrent disregard for child privacy laws, the dangers the app can pose to children, and apparent discrimination in its content moderation, and asks whether the app is in compliance with British law.

Firstly, TikTok appears to have violated child privacy laws on several occasions and, despite previous enforcement action, has not rectified the issue. The app has been found to have broken data protection laws in the EU and the UK by collecting data from children under the age of 13 and selling it to third-party companies in order to profile users' personal preferences and increase advertising revenue. Furthermore, the company was found by the US Federal Trade Commission to have breached the Children's Online Privacy Protection Act of 1998 (COPPA). In addition to being fined £4.3 million in the United States, the platform was fined £123,000 by the Korea Communications Commission. Regulators abroad have highlighted a blatant disregard for child privacy laws, and consequently for the safety of children, and it may be beneficial for the UK government to pursue legal action against the platform (potentially resulting in its removal). This would also act as a strong deterrent to other platforms, emphasising that a social media company's privacy obligations towards minor users must be of paramount concern.

Secondly, TikTok poses further dangers through its notorious viral challenges. One of the most recent was the 'blackout challenge', which encouraged users to choke themselves until they lost consciousness. Many children attempted it: 12-year-old Joshua Haileyesus was left brain-dead in March 2021 after trying the challenge, and a 10-year-old girl in Italy reportedly died after taking part. These cases are only a drop in the ocean. A plethora of other challenges, such as the 'penny challenge', which involves users inserting a penny into a live electrical socket, pose grave dangers to young, impressionable users. Article 10 of the European Convention on Human Rights, given effect in UK law by the Human Rights Act 1998, protects the right to hold opinions and to impart information. However, this right is not absolute: Article 10(2) allows public authorities to restrict expression where doing so is prescribed by law and necessary for, among other things, the protection of health. Similar restrictions have already been applied online to COVID-19 misinformation, yet TikTok has not applied comparable warnings to videos promoting dangerous trends to children. Because these challenges spread at such velocity that counter-messaging cannot keep pace, it is imperative that public authorities rely on the qualifications in Article 10(2) to restrict this content, or potentially ban TikTok altogether, as was heavily debated in the US last year. Steps must be taken against the spread of this behaviour to protect children and young people, who are most at risk from these activities.

Finally, owing to its association with the Chinese Communist Party (CCP), TikTok has repeatedly censored posts about the Uighur minority, an estimated one million of whom have been detained in internment camps. Censorship of views critical of the CCP has also been reported in other countries, including Indonesia, the US and the UK. In November 2020, a former executive told a UK parliamentary committee that the platform had censored content critical of the treatment of the Uighurs. Under the company's reported moderation structure, moderators in Beijing had the final say on what content was approved, ensuring that rubber-stamped content remained in line with CCP views. In addition, TikTok has been embroiled in a scandal over the removal and suppression of content posted by LGBT users. This appears to fly in the face of the Equality Act 2010, which protects people from discrimination by service providers such as TikTok on the grounds of protected characteristics, including sexual orientation. By censoring these groups and applying its moderation rules to them unfairly, TikTok appears to be acting in contravention of that Act.

In conclusion, TikTok has repeatedly been shown to act in apparent contravention of UK law, whether in relation to data protection, privacy or discrimination. The platform's popularity, especially amongst younger people, seems here to stay, and so too are these dangers unless TikTok itself, or the UK government, intervenes.


This article was submitted by Lewis A. from Bristol.


Dangers of TikTok challenges:
https://www.indy100.com/news/boy-dies-tiktok-blackout-challenge-b1831210

https://www.foxnews.com/us/colorado-boy-12-brain-dead-after-trying-tiktoks-blackout-challenge

Chinese Communist Party’s actions towards the Uighur minority:
https://www.bbc.co.uk/news/world-asia-china-55794071

TikTok fined in South Korea:
https://www.bbc.co.uk/news/technology-53418077

TikTok COPPA privacy concerns:
https://www.businessinsider.com/tiktok-children-users-data-privacy-safety-concerns-coppa-ftc-report-2020-8?r=US&IR=T

TikTok censorship in countries such as Indonesia:
https://www.reuters.com/article/us-usa-tiktok-indonesia-exclusive-idUSKCN2591ML

TikTok censoring content:
https://www.reuters.com/article/britain-tech-lgbt-idUSL5N2GJ459

TikTok moderation policies:
https://www.forbes.com/sites/siladityaray/2020/10/05/ex-chinese-government-official-was-in-charge-of-tiktoks-content-moderation-policies-report-says/?sh=36fd12ec47c4