
Digital intrusions are a danger to our mental health, says Julia Keseru from The Engine Room

24 November 2020



As part of an international campaign to lift the lid on data privacy violations, The Privacy Collective is asking some of Europe’s leading experts why online privacy matters. 

Julia Keseru is the Executive Director of The Engine Room, an activist organisation working at the intersection of technology, power and social justice. Here she talks to The Privacy Collective about the psychological impact of everyday digital intrusions, why technology is a double-edged sword, and the need to challenge our oppressive digital systems.

Why does online privacy matter? 

Online privacy matters for a number of different reasons – which are pretty much the same reasons why we value privacy in our physical reality. One thing that I think a lot about these days is what I call the unwanted touch of the digital era. Everyday digital intrusions (like facial or emotion recognition) have an impact on our mental wellbeing that is eerily similar to that of physical violations of our bodies. Our sense of integrity as human beings is largely shaped by how much control we have over our privacy and over the information that is collected about us.

Right now, however, the debate around privacy is very much focused on law and ethics. And I really think we're going to have to expand that to include the impact on us as human beings and on our mental wellbeing if we want to understand what emerging technologies are really doing to our societies.

Can you tell me a bit about your work with The Engine Room, and how you’ve seen data and technology being a double-edged sword?

The Engine Room team is based across 10 countries on four continents, and in the past decade we have worked with more than 500 organisations on their data and technology strategies. We primarily support civil society – human rights defenders, anti-corruption activists, the racial justice movement, disability justice organisations, women's rights, LGBTQI rights, and climate justice activists – in their efforts to use technology and data in strategic, effective, and responsible ways.

Technology can be a double-edged sword. We see that tech and data can create enormous opportunities for civil society organisations to increase their impact – to find new sets of information, to amplify their work through automation, to tell their story in better ways, to reach a larger audience, to organise more effectively, and more. On the other hand, we also often see civil society using technology in irresponsible or less strategic ways. So our goal is to help them understand the possibilities, opportunities, challenges and unintended consequences of using certain technology solutions.

You wrote an interesting piece recently about everyday digital intrusions and why you think our current digital systems are built on coercive, controlling and predatory dynamics. Can we talk about that and why you think we accept that in the digital world, when we wouldn’t in the physical one?  

I think the reason it's incredibly hard to translate that experience to the digital reality is that science is not yet able to account for exactly what is happening to us when our integrity is violated in this way. And so our entire approach to the digital world is very different from the norms that guide our physical reality – we have basically allowed these coercive, controlling, surveillance-fuelled, predatory systems to guide and govern our everyday lives.

It is still difficult for most of us not to think of the digital world as something purely rational and isolated from our “human” or physical side, rather than seeing it as an extension of our current reality, something that is also deeply connected to our feelings and emotions. In their book Data Feminism, Catherine D’Ignazio and Lauren Klein argue that current approaches to data science and ethics, for instance, have a tendency to recreate the same patriarchal and oppressive systems we’ve fought against in our physical reality for many years. It’s almost as if we have to restart that fight in the digital world.

The same is true for capitalism and the underlying business model of the tech industry. Shoshana Zuboff writes about “surveillance capitalism” and argues that in the past 20 years we have allowed these dynamics – digital surveillance, behavioural modification, unregulated tech monopolies, non-transparent online advertising – to become the norm. I think we were a little bit naive in how we thought about the Internet’s revolutionary impact, and at the same time, we weren’t really prepared for the price we would have to pay. All of that is happening at the expense of some of our fundamental freedoms.

The Engine Room is a founding member of the Responsible Data community. What does responsible data use look like? What challenges are associated with this? 

There are some baseline things that we always do at The Engine Room, and we encourage and recommend that our partners do them too. Obviously privacy and data protection play a huge part, but responsible data isn’t only about that. It also means things like data minimisation – collect only the data that is strictly necessary and does not put anyone at risk. Be very conscientious and cautious about how you share that data with others. Think about the unintended consequences in a worst-case scenario, and all of the potential harms of using a certain type of technology or collecting certain information. Delete the data that you no longer need. Protect the people who work for you, protect the people you collect data about, protect your sources, protect your operations, especially if you’re a nonprofit working in fragile or hostile contexts. Store your data securely.

Our approach to responsible data also implies that openness and transparency should be part of the discussion, but many people still see those two (privacy and transparency) as opposing forces. As part of our own internal responsible data policy, we make our sources of revenue public, so people know how we’re funded and by whom. Finally, we always encourage organisations to use technology and data management solutions that were designed with openness as a principle in mind, ideally alongside or by the very communities who will be most affected by these technologies.

What changes would you like to see around how technology, which makes use of this data, is designed? Or is it about more legislation governing its use? 

There is no simple solution. We will need both to mobilise and educate ourselves and to introduce stricter regulation – on data protection, AI-fuelled tech solutions, IT procurement, tech industry monopolies, online ad transparency, bans on facial and emotion recognition, and so on. There seems to be growing public demand for stricter regulation – research by the Ada Lovelace Institute, for instance, shows that most people in the UK want their government to impose stricter restrictions on facial recognition technology.

But I also think that protecting ourselves against already existing digital solutions won’t be enough. We really need to challenge our assumptions about the norms that guide the design of technology systems in the first place. Tech and racial justice scholar Ruha Benjamin says technology should ‘move slower’ in order to empower people, instead of following Silicon Valley’s current motto of ‘moving fast and breaking things’. I couldn’t agree more. We have a flawed notion of how technology and innovation should work – we think that progress should be fast and linear, and we tend to forget about those harmed by innovation, or those left behind. That’s just not a sustainable approach to our collective wellbeing. Alternative approaches like Sasha Costanza-Chock’s design justice approach, data feminism, or Adrienne Maree Brown’s emergent strategy concept can help us reimagine our digital futures.

What can people do to educate themselves and protect their online data today?

I think we need to start using our collective bargaining power as voting citizens (plus the tools provided by the law, like data protection laws or freedom of information regulations). We need to demand more information and hold our governments and the private sector accountable and in check. How are emerging technology solutions being used by government agencies? How are procurements for IT contracts and tech solutions carried out, and what factors influence those choices? How is algorithmic decision-making fuelling everything from recruitment to advertising, in both the public and private sector? What type of data is being collected about us, and by whom? How is this information being managed, and with whom is it shared? How can we take back control of the data that is collected about us every day? Citizens have much more power than they think when it comes to scrutinising our decision-making elites, and ironically, emerging technologies can also help mobilise that support at scale.

Your data should not be for sale. We’re taking Oracle and Salesforce to court for the misuse of millions of people’s data, and we need your help! If you believe that tech giants should be held accountable for their use of people’s data, please support our claim here. Because your privacy matters.

Support our case against Oracle and Salesforce.


By leaving your details here, you support our case against Oracle and Salesforce. We may use your full name and email address in the legal proceedings to demonstrate how many people have actively given their support to The Privacy Collective.


- You support The Privacy Collective’s case against Oracle and Salesforce.


- You are part of the group of affected parties.

You are part of this group if you have accepted cookies from Oracle and/or Salesforce from the Netherlands since 26 May 2018 and currently live in the Netherlands. These cookies are present on popular websites such as nu.nl, booking.com, marktplaats.nl, bol.com, buienradar.nl, telegraaf.nl, funda.nl and amazon.nl.


- The Privacy Collective uses your first and last name and email address to demonstrate your support in the proceedings against Salesforce and Oracle, and to contact you about the progress of the proceedings.

Where possible, personal data is pseudonymised. Read TPC’s full privacy policy here.


By clicking ‘Sign up’, you confirm the above.
