The laws in place today have evolved to tackle privacy issues arising from the misuse of our data. The central question is to what extent the General Data Protection Regulation (GDPR) (2016) is sufficient to put a stop to the mishandling of our personal data for political manipulation.

The rationale behind implementing the GDPR

The Data Protection Act (DPA) 1998 made provision for the regulation of the processing of individuals’ information, including the obtaining, holding, or disclosure of such information. With the growth of social media and the expanding use of personal data, it became clear that stronger data protection laws had to be implemented, specifically concerning the handling and storage of personal data.[1] On 25th May 2018, the GDPR took effect across the EU, imposing stringent obligations on those using personal data and granting stronger rights to individuals, both within and outside the EU.[2] The Regulation was designed to harmonise data protection laws throughout Europe, enhance the rules governing data transfers outside the EU, and give individuals greater control over their personally identifiable information.[3] The latter was achieved by restricting the circumstances in which such data may be processed, thereby encouraging the use of anonymised data instead.[4] This is a significant step towards protecting users’ data: anonymised data does not fall within the definition of personal data, so it lies outside the scope of the GDPR and can be used freely, without limitation. This is a potential approach which social networking websites such as Facebook could adopt instead of misusing personal data, especially where no consent has been obtained. Users’ confidentiality would be maintained, and privacy breaches should be limited. However, people may be sceptical, as research in the United States has suggested that over 99 percent of Americans could be re-identified using 15 demographic attributes, thus defeating the goal of anonymisation.[5]

Issues which the GDPR addressed

The DPA, and the Data Protection Directive which it implemented, had several weaknesses which the GDPR attempted to resolve. The first concerned the lack of harmonisation of laws across the EU.[6] A 2009 report by the RAND Corporation indicated that the Directive gave rise to inconsistencies between Member States’ laws, as each state could determine how the goals of the Directive were to be implemented.[7] Secondly, the Directive was ambiguous as to which parties outside the EU were subject to its provisions and to Member State laws. The GDPR resolved this uncertainty by providing that anyone who processes the data of individuals resident in the EU is subject to the Regulation, regardless of whether they have an office in the EU. This clarity allows individuals and companies to know whether they are bound by these laws. A further weakness was the low maximum fines permitted under Member State laws. The GDPR addressed this by raising the maximum penalty for a violation of the Regulation to 4% of annual global turnover or €20 million, whichever is greater.[8] As touched upon in the previous article, this increase is still modest compared to the fines available in competition law, where breaching activity can result in a fine of up to 10% of a company’s worldwide turnover.[9] DLA Piper’s survey suggests that regulators and courts are likely to look to EU competition law for inspiration when calculating GDPR fines; however, opinions are mixed, with some commentators agreeing and others arguing that such an approach would be contrary to the GDPR’s provisions.[10]
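
As a purely illustrative sketch, assuming a hypothetical undertaking with an annual worldwide turnover of €1,000 million (a figure chosen for illustration only, not drawn from any of the sources cited here), the maximum fines under the two regimes would compare as follows:

\[
\text{GDPR cap} = \max(0.04 \times \text{€1,000m},\ \text{€20m}) = \text{€40m}
\]
\[
\text{Competition law cap} = 0.10 \times \text{€1,000m} = \text{€100m}
\]

On this hypothetical, the €20 million limb operates as a floor that matters mainly for smaller undertakings: once annual worldwide turnover exceeds €500 million (since 4% of €500m is €20m), the 4% limb produces the higher, and therefore applicable, cap, yet it remains well below the 10% ceiling available in competition law.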

Another crucial development was the GDPR’s expansion and clarification of ‘personal data’ to cover advances in the kinds of data that machines may be able to collect in the future.[11] The definition under the earlier regime was restrictive, and the wide variation in its interpretation across the EU had practical consequences. This narrowing of the scope of personal data was removed by the GDPR’s clearer definition. Article 4 gives examples of identifiers other than a name, including an ‘identification number’ and ‘one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity’ of the person.[12] Photographs are also caught by this definition so long as they are stored alongside an individual’s information. Overall, understanding what constitutes personal data is crucial: it is the threshold requirement for the application of any data protection rules, and thus for tackling the issue of improperly gathered data being used for political manipulation, as seen in the CA scandal.

The consent requirement

To ensure that the law sufficiently protects one’s data, it is important that holders of our data, such as social media companies, clearly understand the conditions under which they can use it lawfully. Prior to the GDPR, consent had arguably become a magic wand that could be waved by any popular online service to secure itself a revenue stream of personal data whilst remaining legally compliant.[13] The Regulation addressed this by setting out the conditions under which data can be lawfully processed: consent, contract, legal obligation, vital interests, public task and legitimate interests; only one of these needs to apply.[14] One could question, however, whether it is sufficient for a data controller to justify its actions on a single ground, or whether there should be a minimum threshold. Consent is increasingly failing as an effective threshold condition in a world where users regularly rubber-stamp standard-term contracts to access online services, effectively giving away their data without meaningful consideration, negotiation or safeguards.[15] Surveys suggest that only 11 percent of UK citizens trust social media companies with the use of their data.[16] In the hope of positive change, the GDPR requires that consent to data sharing is not bundled with other matters in a contract; it must be clearly distinguishable.[17]

It is clear that, as consumers, we should be reading the terms and conditions of our user agreements more closely; however, a large proportion of us fail to do so, often because we have little real choice: we must accept the terms in exchange for access to the website. The Deloitte survey supports this, finding that 91 percent of consumers accept legal terms and conditions without reading them; among younger people aged 18-34 the figure rises to 97 percent.[18] This can be extremely dangerous, as without consciously realising it, one may consent to uses of their data to which they would not ordinarily agree, in particular for political manipulation. The GDPR strengthens the consent requirement by requiring proof of consent from individuals aged 16 or over, given ‘by a statement or by a clear affirmative action’.[19] This is an evolution from the Directive, which only required the user to signify agreement.[20] Silence, inactivity and pre-ticked boxes do not meet the requirement of affirmative consent under the GDPR and are banned.[21] This is further supported by Article 5(2), which introduces a seventh principle of accountability; it places a burden of proof on businesses to demonstrate their compliance with the GDPR’s provisions, but it also aims to promote substantive compliance rather than a mere box-ticking exercise.[22] This development should ensure that companies which collect, process and store data subjects’ information are more transparent about the process, using clear and unambiguous terms so that users are not deceived. This ought to increase users’ trust in the platform, in particular Facebook, following the scandals discussed earlier.

Reform suggestions for the GDPR

Research suggests that the Regulation has helped to tame the behaviour of the tech giants.[23] However, even with laws and policies in place, the potential for data misuse is growing, with employees and third-party contractors being the most common perpetrators.[24] The Regulation has also failed to convince users that they have more control over their data: six months after it came into effect, EU consumers’ trust in the internet was at its lowest in a decade.[25] Cooperation from each Member State is needed to ensure that the legislation is obeyed. This could be secured by allocating appropriate financial and human resources to data protection authorities and by carrying out routine checks on companies to verify their compliance with the Regulation; where breaches are found, companies can be fined accordingly. Other potential reforms include the establishment of an unambiguous framework against which companies can be judged, to ensure that they have upheld the necessary requirements for processing data. Such a reform would also promote the harmonisation of data protection laws, one of the GDPR’s key aims. A further proposal is to produce an objective framework setting the standard that political advertisements must meet before publication, to ensure transparency. The implementation of a statutory code of practice governing the use of personal information in political campaigns would be another significant reform, providing greater transparency to users. These are just some ideas which could act as a stepping-stone to improving the effectiveness of the GDPR so that it adequately protects users’ personal data from being misused to manipulate political votes.

In conclusion, the CA scandal demonstrated that the regulations in place were not strict enough to deter companies from disobeying the relevant policies, which led to a huge decline in users’ trust. The implementation of the GDPR addressed these issues, intending to provide individuals with stronger data protection rights over the use and processing of their data. Two years on from its implementation, it is evident that, despite some positive changes, the developments are not yet sufficient. This justifies the need for reform to tackle the unresolved issues, as briefly suggested above.


[1]Charlotte Rogers, ‘If Facebook Was Really Serious About Change It Would Cut The Political Ads’ (Marketing Week, 2019) <https://www.marketingweek.com/facebook-political-ads/>

[2]‘Cambridge Analytica, GDPR - 1 Year On - A Lot Of Words And Some Action’ (Privacy International, 2019) <https://privacyinternational.org/news-analysis/2857/cambridge-analytica-gdpr-1-year-lot-words-and-some-action>

[3]Bernd Schmidt, ‘The Applicability Of The GDPR Within The EEA’ (Technically Legal, 2018) <https://planit.legal/blog/en/the-applicability-of-the-gdpr-within-the-eea/>

[4]Piero A Bonatti and Sabrina Kirrane, ‘Big Data And Analytics In The Age Of The GDPR’ (IEEE Big Data Congress, IEEE Computer Society 2019) <https://epub.wu.ac.at/7007/1/IEEE-Services-19-SPECIAL.pdf>

[5]Leslie Picker and Nick Wells, ‘“Anonymous” Data Might Not Be So Anonymous, Study Shows’ (CNBC, 2019) <https://www.cnbc.com/2019/07/23/anonymous-data-might-not-be-so-anonymous-study-shows.html>

[6]Kimberly Houser and W Gregory Voss, ‘GDPR: The End of Google and Facebook or a New Paradigm in Data Privacy?’ (11 July 2018) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3212210>

[7]Neil Robinson and others, Review Of The European Data Protection Directive (RAND Corporation 2009) <https://www.rand.org/pubs/technical_reports/TR710.html>

[8]‘GDPR Penalties And Fines’ (Itgovernance.co.uk) <https://www.itgovernance.co.uk/dpa-and-gdpr-penalties>

[9]Quick Guide To Complying With Competition Law (4th edn, Competition & Markets Authority 2014) <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/306899/CMA19.pdf>

[10]DLA Piper, ‘DLA Piper GDPR Data Breach Survey’ (2019) <https://www.dlapiper.com/~/media/files/insights/publications/2019/02/dla-piper-gdpr-data-breach-survey-february-2019.pdf>

[11]Houser and Voss (n 6)

[12]GDPR (n 3), Article 4

[13]Lilian Edwards, Law, Policy And The Internet (1st edn, Hart Publishing 2019)

[14]GDPR (n 3), Article 6(1)

[15]Edwards (n 13)

[16]Carolyn Black, Lucy Setterfield and Rachel Warren, Online Data Privacy From Attitudes To Action (Ipsos MORI Scotland for Carnegie UK Trust 2018) <https://d1ssu070pg2v9i.cloudfront.net/pex/carnegie_uk_trust/2018/08/03110116/Online-Data-Privacy-from-Attitudes-to-Action-CUKT.pdf>

[17]GDPR (n 3), Article 7(2)

[18]Caroline Cakebread, ‘You’re Not Alone, No One Reads Terms Of Service Agreements’ (Business Insider, 2017) <https://www.businessinsider.com/deloitte-study-91-percent-agree-terms-of-service-without-reading-2017-11?r=US&IR=T>

[19]GDPR (n 3), Article 4

[20]Directive 95/46/EC (n 71), Articles 2(h) and 7(a)

[21]GDPR (n 3), Recital 32

[22]Guide To The General Data Protection Regulation (Information Commissioner’s Office 2018) <https://ico.org.uk/media/for-organisations/guide-to-the-general-data-protection-regulation-gdpr-1-0.pdf>

[23]Steve Ranger, ‘GDPR Proves That Tech Giants Can Be Tamed’ (ZDNet, 2018) <https://www.zdnet.com/article/gdpr-is-already-a-success-whether-you-like-it-or-not/>

[24]‘5 Examples Of Data & Information Misuse’ (ObserveIT, 2018) <https://www.observeit.com/blog/importance-data-misuse-prevention-and-detection/>

[25]Eline Chivot, ‘One Year On, GDPR Needs A Reality Check’ (Financial Times, 2019) <https://www.ft.com/content/26ee4f7c-982d-11e9-98b9-e38c177b152f>