As part of an international campaign to lift the lid on data privacy violations, The Privacy Collective is asking some of Europe’s leading experts why online privacy matters.
Sophie in ’t Veld has been a Dutch member of the European Parliament for the social liberal party Democrats 66 since 2004. She’s also a member of the Parliament’s committee on civil liberties, justice and home affairs, and sits on the advisory board of Privacy International. Here, she discusses with The Privacy Collective the growth of public awareness around the importance of data privacy, GDPR’s enforcement problem, and why mergers between companies like Google and Fitbit should worry everyone.
Why does online privacy matter?
When I started to work on these issues in 2004, I got this question all the time. But in recent years awareness of the importance of privacy and data protection has grown, particularly when it comes to the commercial use of personal data.
Maybe the thing that people don’t sufficiently understand yet is what privacy and data protection actually mean, and what’s behind those abstract terms. I’m beginning to be a firm believer in the principle of privacy by disaster – it takes something going wrong for people to realise the importance of this. There’s almost a cognitive dissonance between hearing about the possible misuse of personal data via scandals such as Cambridge Analytica, and people connecting it to their own daily lives. It’s not even necessarily about what these companies know about you, or the fact that they’re storing this data. It’s the fact that they can then use it to steer your behaviour. That is something which completely escapes most people.
What role is the Netherlands specifically playing in the fight back against technology monopolies and their handling of private customer data?
There was a big scandal in the Netherlands recently, where the tax authorities were found to have been using personal data for ethnic profiling – people were accused of being tax evaders or fraudsters based on their citizenship status. It has raised awareness of some of these risks among the public. But I don’t think the Netherlands is leading on this issue. Awareness is much stronger in countries like Germany, for example, or some Eastern European countries.
A big blind spot that exists across Europe more generally is the use of personal data by law enforcement authorities and security agencies. People tend to trust these organisations or at least not feel like this is an urgent problem. But I think it might actually become one of the biggest privacy and data protection issues in years to come.
In 2018, you described the Cambridge Analytica scandal as a ‘wake up call’ for Europe. Do you think enough has been done since to make sure this type of violation of personal data doesn’t happen again? And if not, what needs to be done?
I almost feel as if the alarm clock has been pushed back to snooze mode until the next wake up call. Not enough is being done. There’s a disconnect at the Commission where on the one hand, they’re going bananas over disinformation and fake news and election interference, with talk about Russia and China and Saudi Arabia. But on the other hand, they’re completely blind to the fact that a lot of this is actually being propagated by European leaders, in places like Hungary and Poland, and by the United States. I don’t know how they can say two contradictory things at the same time. Cambridge Analytica was a wake up call to the fact that these things can happen. It’s not changed the fact that the Commission, but also Council and Parliament, just don’t know how to deal with the forces at play within our own democracy. I think that is dangerous.
There are some critics who say GDPR, while good legislation, has an enforcement problem. Would you agree, and what needs to be done to close that gap?
GDPR is the best privacy law in the world, but enforcement has been pathetic in every respect. National governments are not allocating the means and the powers they are legally obliged to provide. The national Data Protection Authorities (DPAs) and the European Data Protection Board (EDPB) are too timid for my taste. The fact that we have to rely on a single DPA in Ireland and another in Luxembourg – two gatekeepers for the entire European Union – is absurd. We’re now seeing more private enforcement of these issues, and although I wholeheartedly welcome it, we cannot rely on that.
The European Parliament should be a lot more assertive too. We are letting the Commission get away with not doing anything. It’s really a dereliction of duty that the Commission is too chicken to go against the member states or stand up to the American government. It’s very cynically counting on private citizens to go to court and litigate for five or six years instead.
You’ve recently filed a question to the Commission about a planned merger between Google and Fitbit. What concerns you about this proposal? What cautionary lessons do you think should have been taken from the Facebook/WhatsApp merger in 2014?
We say that GDPR is the best law ever made by the European Union, but at the same time it’s considered to be of secondary importance when we are legislating on economic issues. Mergers are looked at as a competition issue, rather than considering the human rights implications. Facebook said at the time of that merger: “Scout’s honour, we’re not going to use WhatsApp’s data.” Then they did. Similarly, I remember in 2008, when there was the merger between Google and DoubleClick, we were screaming blue murder to the Commission, saying, “You can’t just let this go through,” and the Commission said, “No, we’ve investigated and they’re two completely different companies. Google is a search engine, and DoubleClick is an ad company. So they’re not competing.”
It’s taken 12 years for the Commission to now begin to consider both competition issues and personal data issues. The concern with the Fitbit and Google proposal is that the big tech companies now hold the largest amount of medical data in the world. This is very sensitive information. And these are not European companies, even if they’re household names. The functioning of our societies, the functioning of our democracies, depends entirely on American companies. Don’t we think that’s scary? When it comes to defence and raw materials, we talk about the importance of strategic autonomy – we say we don’t want to depend on Russia for gas, or Saudi Arabia for oil. We want to be autonomous. That’s partly why we’re investing in renewable energies. With technology? No.
What can individuals do today to educate themselves and better protect their data online?
I believe it’s a matter of being aware, talking about it with others, and asking questions. Wherever I go, if people ask me to leave my data behind I always ask why and I often protest. People are always surprised and a bit irritated that I’m being so difficult but many of them change their practices. It helps people understand that it’s not ok to ask for so much personal information.
Secondly, from a political perspective, we’ve seen a lot of activism when it comes to climate change or foreign policy, for or against populism. People are mobilised, but they’re not very mobilised on this issue. I think they should be. The younger generation is very active, but the one thing they don’t do is vote. Parliaments and governments are the ones making the rules, so I find that difficult to understand. If young people are mobilised enough to take to the streets, then why do they leave it to others to determine what the government will look like? Look at what happened in the United States. Every single vote counts. Don’t leave it to chance. Vote.