Impact of GDPR on data protection and privacy
1 Apr 2018
Ryan Nazareth

The General Data Protection Regulation (GDPR) comes into force on 25 May 2018, replacing the current Data Protection Act, and will transform the way organisations store and process personally identifiable data. One of the most important developments is the strengthening of a number of 'data rights' for individuals, who will be able to access their data free of charge, have it updated, and request that it be deleted (the 'right to be forgotten'). GDPR also broadens the definition of 'personal data' to include online identifiers such as IP addresses, mobile device identifiers and biometric data.

We have all experienced the ambiguous, small-print terms and conditions, or the random 'I agree' statement that pops up on our screens when navigating websites. It is not always obvious how personal data is being used, or what additional information, such as cookies or other digital data, is being collected and stored. The vagueness surrounding the application, extraction and exploitation of data by third parties is where the line between legality and morality starts to blur (this topic is covered in an episode of the Moral Maze on Radio 4). GDPR sets out to tackle this lack of transparency directly. Companies will have to inform individuals, in a much more transparent way, of the purposes for which their data will be used and for how long. Failure to comply with GDPR can result in far more severe punishments than under the current Data Protection Act, with fines of up to €20 million or 4% of global annual turnover (whichever is higher) for serious infringements.
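
To make the 'whichever is higher' rule concrete, here is a minimal Python sketch of the maximum-fine arithmetic; the turnover figure below is purely hypothetical:

def max_gdpr_fine(global_annual_turnover_eur):
    """Maximum fine for a serious GDPR infringement: the higher of
    EUR 20 million or 4% of global annual turnover."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# Hypothetical company with EUR 1 billion in global annual turnover:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0, i.e. a cap of EUR 40 million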

Recent data breaches

In July 2017, the Information Commissioner's Office (ICO) ruled that London's Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it transferred the data of 1.6 million patients to Google's artificial intelligence arm DeepMind as part of a trial to create 'Streams', a healthcare app that would alert doctors if patients were at risk of a condition called acute kidney injury. Patients were not adequately informed that their data would be used as part of the test, which breached the Data Protection Act's principle that data be used only for a specified purpose. The ICO concluded that the processing of the data was neither fair nor transparent.

In 2014, Facebook users were invited to find out their personality type by taking a quiz on an app called 'This is Your Digital Life'. Users also allowed the app to access data from their Facebook profiles, including information about their likes and behaviours. Data was collected from 270,000 users, but the app also had access to the data of test-takers' Facebook friends, thereby gathering sensitive data on 50 million users. In 2015, the data was sold on to Cambridge Analytica, a data-mining consulting firm, which matched the answers to the personality test with information about users' likes and behaviours to build psychological profiles. This enabled the company to run more targeted political advertising campaigns based on user personality type, with a view to influencing votes in the US election and the 2016 Brexit referendum. Selling the app data to a third party and processing it in this way violated the Data Protection Act; following an investigation by the ICO in March 2018, Cambridge Analytica was suspended and all the data was deleted.

What would happen under GDPR?

In the case of the Cambridge Analytica/Facebook scandal, any data processed from the time the app was launched until now falls under the Data Protection Act. If data is processed after 25 May 2018, GDPR comes into play, imposing much stricter rules on transparency, definition of purpose and consent. The distinction between an organisation that holds the data (the 'data controller' under GDPR) and those that access or process the data ('data processors' under GDPR) is made more explicit, which would make it easier to establish who directly accessed and controlled the data; Facebook, for example, would be the data controller. The 'right to be forgotten' would also allow users to have their data erased, and companies controlling or processing data would need to request consent from users in a clear and understandable manner.

The Cambridge Analytica/Facebook scandal shows that consumers' personal data is more vulnerable than we think and can be abused in ways we are not aware of. GDPR is designed to prevent this sort of abuse by imposing hefty fines and making data protection far more transparent for the layman, demystifying the confusing jargon around how data is processed, passed on and often misused. Implementation of these changes will hopefully change people's lives for the better.


Ryan is a Data Scientist at Manta Ray Media. He has a keen interest in natural language processing, big data analytics and visualisation applications | Twitter: @rkn0386