
Cookies, pop-ups, GDPR, DPO … Protection of personal data and informed consent

This post was co-authored by Deborah Liebart (fellow of the SCImPULSE Foundation, and former Data Protection Officer of the European Society of Athletic Therapy and Training) and by Dr Salvatore Cognetti (fellow of the SCImPULSE Foundation, and former Medical Liaison Program director of the European Society of Athletic Therapy and Training). It first appeared on DisputatioMagistrorum and on the blog of ESATT. It is associated with DOI: 10.5281/zenodo.4081241 and is licensed under CC BY 4.0.

At a time when going to a restaurant in Europe involves handing over our contact details (understandably so, in the current context), and when, inevitably, as new actors are authorised (or even required) to take new actions in the data ecosystem, we receive news of those data being monetized beyond their original scope, it is necessary to think about how that data is being collected, processed, and archived. I would therefore like to spare a few words on the text that is at the foundation of data-collection practice in Europe: the General Data Protection Regulation (GDPR). We hear about it sporadically, but we know far too little about it, and its application by data collectors is still far from perfect1.

This image is licensed under CC BY 2.0; the creator was Owen Moore


So, what is the GDPR, and what does it really tell us about the society in which we are evolving today?

    • What is the GDPR?

The GDPR is a piece of European legislation, adopted by the European Parliament in April 2016, which regulates how business and institutional actors collect and use data in an “open world” where new technologies make individual profiling possible, allowing the creation and “offer” of targeted marketing and personalised advertisements. This is done by cross-checking different sources of information, such as search keywords, geolocation, and travel data, as well as personal data (age, gender, biometric information, place of birth and of residence, recent purchases, etc.). An example of this is the “cookie”2, the technology behind the ubiquitous pop-ups that peddle, say, baby bottles as soon as you’ve searched Google for maternity hospitals close to your home (which, by the way, will statistically tend to also profile you as female…)
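The mechanism behind such profiling is simple enough to sketch. The following is a minimal, hypothetical illustration (not any real tracker’s code): a random identifier stored in the browser as a cookie keys a server-side profile that grows with every visit.

```python
# Hypothetical sketch of cookie-based profiling, for illustration only.
import uuid


def issue_cookie():
    """On a first visit, the site stores a random identifier in the browser."""
    return {"tracking_id": str(uuid.uuid4())}


def record_visit(profile_db, cookie, search_terms):
    """On each later visit, the same identifier keys new observations,
    so seemingly anonymous browsing accumulates into a profile."""
    profile = profile_db.setdefault(cookie["tracking_id"], [])
    profile.extend(search_terms)
    return profile


profiles = {}
cookie = issue_cookie()
record_visit(profiles, cookie, ["maternity hospitals near me"])
record_visit(profiles, cookie, ["baby bottles"])
# Cross-referencing just these two searches is already enough
# to infer sensitive traits about the person behind the cookie.
```

The identifier itself looks meaningless, which is precisely why the GDPR treats it as personal data: it is the key that links everything else together.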

So, how can it be ensured that this personal data, about the private life of users that the legislator seeks to protect, will not end up being used by unscrupulous operators who create gigantic databases in order to profile anyone as a potential customer, to then use or sell that data for their own profit? Equally, how can it be ensured that this data is not used by a totalitarian regime to profile and arrest an individual for any arbitrary reason such as religion or sexual orientation?

Acknowledging the challenge posed by this unprecedented volume of data flow, the European Parliament, through the GDPR, creates a framework for the use of the collected data, with the aim of protecting people’s physical and moral integrity, both within and outwith the Union, as the transfer of data both into and out of the Union has become significant.

By regulating this daily, capillary traffic, the Parliament insists on the need to, above all, protect personal data, and recognises this protection as a fundamental right of people, irrespective of their nationality and country of residence. It aims to create a zone of freedom, security, and justice, and to empower the economies within the internal market, by regulating these practices with strict rules, which are defined by the European and National laws.3

While the principle may seem clear, what does this mean in practice for companies? And what impact does it have on their development and operations? Every kind of company is concerned, from SMEs to multinational corporations, as all companies are now bound by Law to protect their customers’ data.

With the GDPR, Parliament draws the border of the individual’s right to freely dispose of his or her data: not only data concerning them as a person, but also the traces of their daily activity in the online, dematerialised virtual world, which used to be disconnected from the Law and legal challenges. Indeed, it used to be difficult to track a source or access one’s own personal data — and even more so, modify it once it had fallen into a third party’s hands. In practice, this is the multitude of pop-ups that block you from accessing websites and ask you to consent to the collection of your data for various purposes (and sometimes even shut you out of the website unless that consent is given). The very same pop-ups that, sometimes out of weariness, and other times due to the sheer difficulty of refusing, we end up closing by clicking on “accept” or “accept all”, ill informed about the importance of this invisible data collection, without even wondering what is at play behind the scenes… even though the GDPR defends the idea of a positive, informed consent, to be explained to the user by the collector or the institution, in accordance with ethical standards.

Behind these virtual windows, the fundamental values enshrined by the GDPR come into play: the right of a company to collect data, but above all the obligation to make that collection open to consultation by the user, in complete transparency, and yet secure. By insisting on the obligation to erase this data as quickly as possible once the use for which the user gave his or her consent has been completed, the GDPR reinforces not only the right to be forgotten, but above all the user’s right to monitor the use that the collector makes of his or her data. It is for this purpose that companies are obliged to pseudonymise databases as much as possible, as well as to encrypt them for security reasons.
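What pseudonymisation means in practice can be sketched briefly. In this hypothetical example (the field names and key are illustrative, not from any real system), direct identifiers are replaced with keyed hashes; only whoever holds the separately stored key can link the pseudonyms back to people, while the rest of the record stays usable for statistics.

```python
# Hypothetical sketch of pseudonymisation with a keyed hash.
import hashlib
import hmac

# The key must be stored separately from the database it protects;
# this value is a placeholder for illustration only.
SECRET_KEY = b"kept-apart-from-the-database"


def pseudonymise(record, identifier_fields=("name", "email")):
    """Replace direct identifiers with keyed hashes. The substitution is
    deterministic (the same person always maps to the same pseudonym),
    but reversing it requires the secret key."""
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()
    return out


patient = {"name": "A. Example", "email": "a@example.org", "visits": 3}
safe = pseudonymise(patient)
# `safe` keeps the non-identifying fields intact but carries no
# readable name or email address.
```

Note that pseudonymised data remains personal data under the GDPR, since re-identification is possible for the key holder; it reduces risk, it does not remove obligations.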

For corporations, this means hiring competent staff (whether internal or externally subcontracted) who know the legislation and are able to apply it. Ideally, it would be applied with some prior knowledge of ethics, for what the GDPR protects and enshrines into Law is indeed respect for the privacy of individuals and their households, and medical confidentiality; freedom of thought, of conscience, of religion, of expression, and of information; entrepreneurship; and ultimately the right of access to an impartial tribunal. In other words, a certain idea(l?) of democracy and civil rights, in the face of the ubiquitous emergence of new technologies in our day-to-day life.

Imagine, for example, a manufacturer of smart watches with ECG capability and cloud connectivity teaming up with a consortium of bankers or insurers… Or how about a database that made it possible to profile people and predict their performance, behaviours, and attitudes in the professional world? Or, finally, what if cross-referenced medical records were to facilitate discriminatory hiring on the basis of disability, illness, past medical history, or even misused genetic data, which could be used to stratify the risk of developing certain illnesses?

At a time when public authorities have introduced anonymous CVs to prevent discrimination on the basis of someone’s name or home address, such efforts seem almost off-topic: public authorities are late, always lagging behind the everyday reality of technologies and the opportunities they create, for good and for ill.

This, in essence, is the purpose of the GDPR: confidentiality, database encryption, and the strict supervision, by independent supervisory authorities such as the Commission Nationale de l’Informatique et des Libertés (CNIL) in France, the Dutch Data Protection Authority in the Netherlands, or the Garante per la protezione dei dati personali in Italy, of the legal representatives that companies or institutions must appoint: the Data Protection Officers.


    • What does this change for companies?

Since May 2018, companies have been required to have data controllers who are able to respond to users’ questions and requests regarding the use of their data, to monitor the use of data within the company, and, working independently, to discharge their duties under the Law to the National and European institutions. The responsible data controller is also under an obligation to disclose any major infraction, in order to protect public safety.

All companies that encounter genuine difficulties are invited to engage with their country’s regulatory authority in order to obtain assistance with setting up a system that is adequate and adapted to their traffic intensity, keeping in mind that the solutions won’t be the same for SMEs and for multinationals, for example.

Consent must be obtained only after a genuine effort to inform the user, clearly explaining the purpose of the data collection, and undertaking to delete the data as quickly as possible thereafter. Denying consent to such data collection should be made as easy as giving it; hence the CNIL’s recourse, which aims to mandate the introduction of a “refuse all” button. This is what is called informed consent, or conscientious consent4.

Retained data must be secure, encrypted where possible, and available, without limits or fees, to the person whose data was given, in an interoperable format. The data protection officer must respond to users’ questions within a month, and also has the obligation to provide in advance for the appropriate mechanisms to protect people’s fundamental freedoms and rights, without hampering society’s advancement of knowledge, in particular for the purposes of research, statistics, or archiving. Every breach must be reported to the authorities no later than 72 hours after its discovery, so as to prevent, given the sensitivity of the compromised data, the risks of data theft, identity theft, and financial loss. These obligations may lead to the establishment of new collection techniques, but also to lawsuits if the rules are not respected.
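The “interoperable format” obligation (the right to data portability) is the most concrete of these duties, and a minimal sketch makes it tangible. In this hypothetical example (the database layout and function name are illustrative), everything held on one user is exported as machine-readable JSON, which any competing service could ingest.

```python
# Hypothetical sketch of a data-portability export in JSON.
import json


def export_user_data(database, user_id):
    """Return everything held on a single user in a machine-readable,
    interoperable format, as the right to data portability requires."""
    record = database.get(user_id)
    if record is None:
        raise KeyError(f"no data held for user {user_id}")
    # JSON is a common choice: structured, widely supported, human-readable.
    return json.dumps(record, indent=2, ensure_ascii=False)


db = {"u1": {"email": "a@example.org", "purchases": ["book"]}}
export = export_user_data(db, "u1")
```

The point of the exercise is that the export must be complete and usable by the person, not a PDF scan or a proprietary dump that only the original collector can read.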

It is apparent that, for medium-sized companies with moderate levels of traffic (the large ones generally already have a competent service in charge of these issues), this new regulation implies drastic changes in operations and self-governance, placing the data protection officer at the centre of any use of new technologies based on data collection and customer-experience feedback. Recent legal actions by the CNIL show how, after two years (a delay judged necessary for the implementation of internal infrastructure), the commission is beginning to assert the rights of users in a court of Law, be they customers or workers, as in the Uber and H&M cases5. It is therefore an urgent priority for companies that have not yet done so to become GDPR compliant, or face legal action from National regulators.

The second question concerns the way in which a user may assert their rights — or, in other words, how to avoid falling into the trap of unethical data collectors.


    • How to assert one’s rights as a user?

First of all, by making enquiries about the use and purpose of the data collection. Just as nobody would sign a blank cheque, or leave the door open when leaving their home, we should realise that blindly accepting these practices means doing exactly that: opening our private lives and placing them within reach of actors who may be in violation of fundamental national and international laws, by letting them access our applications and our most private data.

Secondly, if, after having contacted the data protection officer, access to the data remains impossible, or no response arrives within the legal timeframe, one can take the matter directly to the national supervisory authority and the European Courts, whether individually or through a class action, so that they can intervene and enforce the regulations, and the GDPR in particular, the supremacy of EU law over national law having been recognised for decades now.

Finally, it is important to note once again that, as harsh as the GDPR may seem, it is a tool for freedom that regulates real, everyday practices, and seeks to prevent rogue data collectors from exploiting legal loopholes in order to sell the collected data on to the highest bidder, whose aim is to target consumers and potential clients ever more precisely. Beyond this, it reaffirms — in theory, at least — the fundamental rights of the individual, by extending the principles of real-life society to the virtual world, making the Internet a space that is subject to the Law, and giving individuals the chance and the means to react against abuse, defending not only the integrity of their physical and moral personhood, but also their freedom to consent or not, ultimately causing the virtual world to submit to the real world’s democratic values. We should be reminded of these rights, refuse to take them for granted in new info-ecosystems, and always be alert when new rights and duties are being discussed, as is happening now, for example, for AI. We, the citizens of Europe, have to stand united and informed, because our political leaders need our guidance in mapping our values and regulating our lives, so that we may thrive together, we and our children after us.

3 Cf. GDPR, art. 2.
