Almost three years ago, on May 25, 2018, the General Data Protection Regulation (GDPR) came into force. Since then, politicians and data protection experts in Europe have been patting each other on the back over how successful the GDPR is on the whole and how it stands as a counterweight to the lax or even non-existent data protection regimes in the US and China. One piece of evidence cited is that even Silicon Valley companies like Apple now hold up European data protection as a model and promote it at home as a template for sensible regulation. Further proof, for many, is that European companies can now position themselves as data protection professionals and export this expertise around the world.
The fight against Google & Co
This argument leaves a stale aftertaste when you consider that the GDPR was initially motivated as a move against the digital giants from the US (and, to a lesser extent, China). The intention was to make it harder for the Googles, Facebooks, Amazons, and Microsofts to collect data, and to make data collection more transparent and controllable for end users. Europe, and the German-speaking world in particular, has also had bad experiences in the past: two regimes that arose from our cultural sphere committed massive violations of their citizens' privacy, and it was precisely this that enabled them to wreak their murderous havoc in the first place. It is understandable that there is a particular sensitivity here to the collection and use of data.
But this noble motive has in some cases had the opposite effect. It was not the American digital giants that were stopped, but the many small and a few larger European companies. These companies are not only overwhelmed by the task of understanding the maze of rules; they also lack the resources to comply with it. What three-person startup can afford its own data protection officer? Even large organizations can hardly find their way through it. Just a few months after the GDPR came into effect, for example, news made the rounds that the Viennese municipal housing operator Wiener Wohnen, which manages 220,000 apartments, had begun replacing all the nameplates on its doorbells so that tenants' names would no longer be visible, following a complaint from a single tenant. The City of Vienna department responsible for data protection had interpreted the GDPR this way and recommended the measure to the housing operator so as not to expose itself to possible lawsuits for data protection violations.
The Sword of Damocles
This sword of Damocles, the threat of a lawsuit because somewhere a cookie consent or a checkbox might be pre-set incorrectly, hangs over every company, no matter how small or large. The European data protection authorities have since described the City of Vienna's interpretation as incorrect, but the confusion and uncertainty remain. After all, if not even Europe's data protection experts agree among themselves, and only a court ruling can bring certainty, who would still dare to touch data at all?
Ignored and ignorant
In any case, none of this stops companies without a branch in the EU from ignoring the GDPR, with the full blessing of European users, who would rather use the applications than do without them. The best example is Clubhouse. Not even a year old, the app was literally overrun by German-speaking users in January 2021, and this despite the fact that it violated just about every privacy rule imaginable. New users uploaded their address books, so contacts' data ended up on Clubhouse's servers without those contacts' consent; the data and audio recordings were stored in the US; and Clubhouse's data was scraped by external parties.

While Clubhouse and other companies care little about any of this, users toil through countless cookie windows. Every single – and I do mean every crappy – European website pushes message windows at you on your first visit about which cookies are being used, what they are for, and which ones you want to accept, before you can finally view the damn site. This is not normal. Neither is the European law that requires it. It betrays a technological ignorance on the part of European regulators that seems all the more depressing when you consider that they apparently do not even understand website cookies, a technologically primitive mechanism that by itself says little about an individual. What, then, about regulating the safety of nuclear power plants, or the artificial intelligence rules the Europeans are tackling right now? The worst is to be feared.
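To underline just how primitive the mechanism is: a cookie is nothing more than a small named value that a server asks the browser to store and send back with later requests. Here is a minimal sketch using Python's standard library; the token and lifetime are invented for illustration.

```python
from http import cookies

# A typical session cookie: an opaque identifier, not a dossier.
c = cookies.SimpleCookie()
c["session_id"] = "b7f3a2"          # opaque token, invented for illustration
c["session_id"]["max-age"] = 3600   # forget it after an hour
c["session_id"]["httponly"] = True  # not readable by page scripts

print(c.output())
# Set-Cookie: session_id=b7f3a2; HttpOnly; Max-Age=3600
```

Whatever a site knows about the person behind that token lives in the site's own database; the cookie itself carries next to nothing.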
The fairy tale of personal data
At the same time, every supporter of the GDPR points out that the data protection laws apply only to personal data. The protection of, and control over, the individual user's data is to be guaranteed and placed in the user's own hands. If I do not want someone to store my phone number or my last visited website, they are not allowed to do so. But it gets more difficult with metadata: to what extent does it fall under the law, and to what extent not? It turns out that this metadata, the data describing the actual data, can often be used to tie anonymized records back to individual users. Netflix learned this the hard way when, in 2006, it anonymized and published a dataset of movie ratings from some 500,000 users for a research competition. With little effort, researchers were able to re-identify individual users by matching the anonymized ratings against public IMDb reviews.
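To make the mechanics concrete, here is a minimal sketch of such a linkage attack in Python. All data, identifiers, and the date tolerance below are invented for illustration; the actual study used a more robust statistical scoring, but the principle is the same: a handful of outside observations about one person, here (movie, approximate date, rating), is usually enough to single out that person's "anonymous" record.

```python
from datetime import date

# "Anonymized" release: real identities replaced by opaque IDs.
# Toy data; the real dataset held roughly 100 million ratings.
anonymized = {
    "user_0042": {("Heat", date(2006, 3, 1), 5),
                  ("Brazil", date(2006, 3, 14), 4),
                  ("Alien", date(2006, 4, 2), 5)},
    "user_0077": {("Heat", date(2006, 5, 9), 3),
                  ("Amelie", date(2006, 5, 11), 5)},
}

# Auxiliary knowledge about one person, e.g. gleaned from public
# IMDb reviews; the dates are only approximately known.
aux = [("Heat", date(2006, 3, 3), 5), ("Alien", date(2006, 4, 1), 5)]

def matches(record, observation, day_slack=14):
    """True if any rating in the record agrees with the observation
    on title and score, with dates within day_slack days."""
    title, when, score = observation
    return any(t == title and s == score and abs((d - when).days) <= day_slack
               for t, d, s in record)

def reidentify(anonymized, aux):
    """Return the anonymous ID that explains every observation, if any."""
    best_id, best_hits = None, 0
    for uid, record in anonymized.items():
        hits = sum(matches(record, obs) for obs in aux)
        if hits > best_hits:
            best_id, best_hits = uid, hits
    return best_id if best_hits == len(aux) else None

print(reidentify(anonymized, aux))  # -> user_0042
```

The more obscure a person's rated movies, the fewer observations are needed; in the published study, a few ratings with approximate dates sufficed to uniquely identify the vast majority of records.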
But what about machine data from production processes, which according to the experts is not personal? If you believe them, it is completely unproblematic.
Not so fast: all machine data is generated by people, and conclusions about people can be drawn from it. The time a machine was switched on corresponds to the shift schedule of the employees on duty. Which mechanic serviced the machine that has now run into a problem can be determined. Who installed which part in which manufactured car can be traced as well. And as soon as the works council and the unions get wind of this, the data collection is over. Because if a manager can infer from the data which employee an incident is attributable to, the finding can make its way into that employee's performance appraisal, and thus into salary negotiations, or in the most extreme case lead to dismissal. Which means that all data, without exception, is also personal data.
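A minimal sketch of how quickly "impersonal" machine data becomes personal: joining an event log against a shift schedule takes only a few lines. All names, times, and events below are invented for illustration.

```python
from datetime import datetime

# Supposedly impersonal machine log: event name and timestamp only.
machine_log = [
    ("spindle_fault", datetime(2021, 2, 3, 22, 41)),
    ("power_on",      datetime(2021, 2, 4, 6, 2)),
]

# Shift schedule, kept by HR for entirely different purposes:
# (shift, start, end, operator on duty).
shifts = [
    ("night",   datetime(2021, 2, 3, 22, 0),
                datetime(2021, 2, 4, 6, 0),  "J. Huber"),
    ("morning", datetime(2021, 2, 4, 6, 0),
                datetime(2021, 2, 4, 14, 0), "M. Leitner"),
]

def on_duty(event_time):
    """Return the operator whose shift covers the event time."""
    for _, start, end, operator in shifts:
        if start <= event_time < end:
            return operator
    return None

for event, when in machine_log:
    print(f"{event} -> {on_duty(when)}")
# spindle_fault -> J. Huber
# power_on -> M. Leitner
```

Neither table is personal on its own; the join is. That is the crux: whether data is personal depends less on the data itself than on what it can be combined with.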
Digital lag due to uncertainty
The engineer in charge of the software and data collection on such a machine thus finds herself at odds with her own data protection officers and legal department. If she makes the mistake of running her data collection plans through privacy and legal review, she learns that it is best never to record any data of any kind. The process of finding this out is so time-consuming, and at the same time so unhelpful, that you want to capitulate on the spot.
After all, anyone who has ever had to deal with these two internal bodies knows that what they mainly produce is warnings of danger. Understandably so: their task is to avoid risk for the company, and at the same time their own. If the company is actually sued for data protection violations, that not only damages its reputation but can also become quite expensive.
However, economic activity, research, product development, and innovation are always fraught with risk; that is in the nature of things. Only a company that stops all activity has (almost) no risk. Except that the risk does not stay with the data protection expert or the legal department; it lands on the management and the engineers. And the latter are the pawns to be sacrificed should something go wrong.
The chilling effect
This chilling effect leads engineers to censor their own data acquisition. When in doubt, they prefer not to record the data, not even data that should by all rights be safe to collect. The effect on domestic companies, the economy, and Europe's innovative strength should not be underestimated. No data, or too little, is collected. The few, sporadic datasets that do exist are hardly ever shared or merged with other data. And because there are hardly any usable datasets, current technologies such as machine learning cannot be fed and trained with sufficient data. Mastering such technologies, let alone advancing them, becomes harder. And where these basic prerequisites are missing, fewer breakthroughs in new algorithms, applications, or insights are possible.
It is a vicious circle. Data protection is placed ahead of data use. Data from which valuable knowledge could be extracted, without ever consuming the data itself, is locked away in the vault of data protection law. Instead of creating wealth from data, we are proud of the vault in which, in the cold darkness, the data, and with it our economic area, withers away.
After only three years of the GDPR, it is time to adapt it to reality. Without change, not only will a cold shiver continue to run down the spine of everyone who is somehow supposed to do ‘something with data’; Europe as a whole will remain frozen on the technological track.