The Facebook – Cambridge Analytica scandal under the new EU data privacy rules
By Teodor Teofilov
The Facebook-Cambridge Analytica Scandal
The Facebook-Cambridge Analytica scandal didn’t become public until March 2018, when reports from the New York Times and the Guardian revealed that personal information had been harvested without permission beginning in early 2014 and used to build a system that profiled US voters and targeted them with personalized advertisements.
Cambridge Analytica (CA) set out to gather data on Facebook users in order to build psychological profiles of a large part of the US electorate. CA assembled the database swiftly through a personality-test app on the social platform. The app collected data on nearly 80 million US voters by harvesting not only the data of the people who took the test but also that of their Facebook friends, who had never installed the app themselves. Since then Facebook has tightened its privacy restrictions, while CA has denied any wrongdoing.
Cambridge Analytica is a private British political consulting company that was formed in 2013 and combines data mining, data brokerage and data analysis with strategic communication for elections.
The European Union’s General Data Protection Regulation was adopted in April 2016 and came into effect in May 2018.
Had the EU’s new data privacy regulation been in force during the Cambridge Analytica scandal, what would the repercussions for Facebook have been?
What is the General Data Protection Regulation in the EU?
The radical overhaul of data protection law in the European Union took effect on May 25, 2018; companies based in or doing business within the EU had two years to prepare for the changes and the compliance challenges they bring.
Smartphones, the Internet, and new digital technologies have transformed the way data is collected and handled, leaving the previous legislation out of date. The sheer volume of personal data online is unfathomable. The General Data Protection Regulation (GDPR) is the EU’s attempt to harmonize data protection laws across Europe, and it is a radical change: it affects every business that processes (collects, records, uses, or discloses) data relating to an identified or identifiable natural person (personal data).
The GDPR is an inescapable piece of legislation that companies have to take into account going forward. Compliance cannot be avoided, since companies routinely process personal data as part of their business activities, hold personal data on their employees, and may use that data for marketing purposes. Any company that holds and processes data about EU citizens has to comply, even if it is not based in the Union.
Companies that don’t comply with the GDPR face severe fines. The maximum penalty for noncompliance can reach €20 million or 4% of annual global turnover, whichever is higher, so there is no room for complacency. What preparations, then, did companies need to make to assimilate the new rules and minimize the risk of incurring such fines and damaging their reputation?
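The fine cap described above is simple arithmetic: the greater of €20 million or 4% of annual global turnover. A minimal sketch in Python (the function name and example turnover figures are illustrative, not part of the regulation):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious infringements:
    the higher of EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A small firm with EUR 10M turnover is still exposed to the EUR 20M cap:
print(max_gdpr_fine(10_000_000))     # 20000000
# A firm with EUR 1bn turnover is exposed to 4% of turnover instead:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

The "whichever is higher" rule means the €20 million floor bites for smaller companies, while the 4% branch dominates for large ones.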
Under the new regulation, companies are required both to comply with the law and to keep records demonstrating that they do. As such, they need a compliance program that puts in place a set of policies, procedures and audit controls with which to monitor and ensure compliance. For such a program to succeed, all areas of the business may need to work together to raise awareness of the new regime and its possible impact on daily operations, and to support risk assessments and record keeping. The regulation prescribes how data may be captured and used, meaning companies had to conduct a detailed review of their personal data processing activities.
A vital step for companies was to assess the legal basis for processing personal data: consent, legitimate interest, compliance with a legal obligation, performance of a contract, and so on. Once that basis is established, companies need to keep a record of it. Companies relying on individuals’ consent to process their personal data now have to meet a higher standard than before: consent needs to be informed, specific, freely given, clear and revocable. Under the GDPR, pre-ticked boxes, silence or inactivity no longer count as consent; the norms that existed previously won’t meet the new standard.
“We are working hard to prepare for the EU’s General Data Protection Regulation (GDPR),” says Google. “Keeping users’ information safe and secure is among our highest priorities at Google.”
The regulation also introduced a new transparency requirement, meaning that companies have to be open about how they process personal data. Everyone whose information a company processes has to be told what information is held about them, how it is used and who it is shared with. Privacy notices under the GDPR are required to provide a greater level of information and have to be far more specific and granular. Notably, one of the most prominent new requirements is that privacy notices must state the legal basis for processing the personal data, so many organizations had to update their existing notices.
Companies also have to abide by the GDPR’s “right to be forgotten” provision. Individuals now have the right to require firms to erase their personal data, for example when the data is no longer necessary for the purpose it was collected for or when the individual withdraws consent. Companies may reject a request to be forgotten if the data is necessary for establishing, exercising or defending a legal claim, or where the law requires the data to be kept. They need to consider the circumstances in which they would reject such a request and work out how to give effect to the requests they accept. In practice, this means reviewing retention practices in general, as data shouldn’t be kept longer than necessary.
Many organizations store the personal data of clients and employees; under the new regulation they need to consider how consent was given for processing purposes and recognize that silence or pre-ticked boxes no longer count as consent after May 25. New standard templates for obtaining marketing consent need to be established, clearly explaining how the data will be used and for how long it will be stored. For employee data, organizations must identify the most appropriate legal basis for processing and carry out periodic reviews to remove data on former or prospective employees that is no longer required.
The GDPR introduced new rules on data security: potential breaches must be reported to the relevant supervisory authority within 72 hours, and in some cases the affected individuals must be informed as well. Companies also need to inform and educate their employees about the new regulation, and regular GDPR training should be given to all staff so they understand the implications of noncompliance. Smaller companies won’t have the resources of a Facebook or a Google, which can hire whole teams to deal with the GDPR.
When sharing personal data with third parties, companies have to make sure those third parties comply with the GDPR before sending the data to them.
With the GDPR now in force, companies need to ensure they comply with it. For some, it’s a matter of identifying what measures already exist, what steps remain to be taken, and how to fill the gaps. The regulation took effect on May 25, 2018, and companies shouldn’t be complacent: the fines can cripple a business, and noncompliance carries a serious risk of reputational damage. Preparation over the preceding two years was key for companies that wanted to avoid losing clients.
The Facebook-Cambridge Analytica scandal under the GDPR
“From Facebook’s perspective,” MacRoberts LLP senior partner David Flint told SecurityWeek before the GDPR came into force, “the only good point is that the maximum fine under the [current UK] Data Protection Act is £500,000; after 25 May 2018 it would be 4 percent of Facebook worldwide turnover ($40bn in 2017) — a potential $1.6bn fine! That’s before damages claims.”
However, it isn’t as clear-cut as Flint puts it. The common perception is that the GDPR is designed to protect EU citizens. Recital 14 states: “The protection afforded by this Regulation should apply to natural persons, whatever their nationality or place of residence, in relation to the processing of their personal data.”
Although this suggests that persons outside the EU could be covered to some extent by the GDPR, it isn’t currently clear whether the provision extends to non-EU nationals; ultimately, interpreting the details will fall to the courts.
The GDPR could apply to the Facebook-Cambridge Analytica scandal on the basis of Article 3(1), which concerns controllers and processors established in the EU. It states: “This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.”
Under the new EU privacy rules, the responsibility lies mainly with the data controller and cannot be pushed onto the data processor. There is little doubt that CA, as a UK-based company processing data from Facebook, which operates within the EU, would be liable under the GDPR. The main issue would be consent. CA would argue that by downloading and installing the app, users gave consent for their data to be used and shared, and that by allowing their data to be shared among Facebook friends, the friends also gave their consent.
This doesn’t work under the GDPR.
The new privacy rules in Europe define consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.” The people who downloaded the app are unlikely to have given informed consent for their personal data to be used for political purposes in the US presidential election.
Depending on how the courts interpret the GDPR, it is hard to determine whether Facebook or Cambridge Analytica would be liable under it. However, if even one affected user was living in the EU at the time, the EU data privacy regulation would apply in full, and the companies could face fines of up to €20 million or 4% of annual global turnover, whichever is higher.