Dear CEO – Tijmen Schep (Gr1p) and Sandor Gaastra (MinEZ)

In a world where the pressure to be perfect is mounting, I call privacy the right to be imperfect. [TS]


Speaking in this conversation:

Sandor Gaastra, Director General for the Energy, Telecom and Competition department at the Ministry of Economic Affairs.

Tijmen Schep writes for the Gr1p Foundation and is a technology critic and privacy designer who helps citizens, designers and policymakers understand new chances and challenges in our data-driven society.


Sandor Gaastra - Letter 1

Dear Tijmen,

Since we are going to discuss privacy anyway, I might as well tell you something about myself. I am the director general of Energy, Telecom and Competition at the ministry of economic affairs. I am a civil servant: I don’t take political decisions. I prepare them, execute them and make sure they’re monitored.

Eh… just replace ‘I’ by ‘my people’. And, on second thought, leave the ‘my’ out, because I really don’t own everyone who works on, for example, accessible and affordable networks for electronic communication, which contribute to innovation and economic growth.

Before I get obscured by policy lingo: I was a manager at the police force once and I still am a father, a cyclist, a reader and a holiday maker, amongst other things. I find technology, policy and administration fun and important, but people even more so. And – coincidence or not – privacy finds itself where technology, information and people meet.

Once privacy was rather uncomplicated: you couldn’t spy on your neighbours, publish confidential photos or open letters that weren’t meant for you. This is neatly laid down in law, in terms like ‘protection of the personal sphere’ and ‘confidentiality of the mail’.

But times change and all of a sudden digitalisation has provided us with a lot of work. The basic rights remained untouched, but they had to be extended to the digital domain. Confidentiality of the mail for example was widened last year to also include digital communication.

Half of the Dutch people ‘pay’ for digital services by sharing private information. Social media harvest these data and ‘at the back door’ sell them on to the highest bidder. Who that is? I haven’t a clue. It happens in a split second. But in the data systems of both the platforms and the buyers ever more detailed personal profiles grow.

Everything you buy, watch, like, shout out and take photos of is in it. Where you are, where and with whom you spend the night, where you’re heading. If you’re sick (‘buys lozenges and paracetamol’) or happy (‘orders two white beers on terrace’) or in high spirits (‘listens to St Matthew Passion’).

Possible consequences: you pay too much for concert tickets, you’re not offered cheaper health insurance, or you’re blackmailed or bullied with photos on Facebook. A tough problem, because it goes – almost – unnoticed. Until things go wrong.

Every time personal data leak on a massive scale, there is a call for law making, a privacy police or harsher punishment. It’s understandable that citizens turn to their government, and of course the basic right to privacy cannot be eroded. But we shouldn’t create unnecessary thresholds for the free flow of data or for corporate innovation.

That’s why we create a legal foundation to enforce that everyone handles personal data with care, and then we look at what we can achieve with less stringent means, like information, a change of behaviour and incentives for market players to be transparent about how they handle our data. Is that enough? What do you think?

Kind regards,

Sandor

PS: Privacy is personal. If I meet you when I’m out cycling I will kindly say hello, but I am bothered by a photo of me in my cycling gear on a public website. Has someone ever crossed your privacy boundary?


Every time personal data leak on a massive scale, there is a call for law making, a privacy police or harsher punishment.
Sandor Gaastra


Tijmen Schep - Letter 1

Hey Sandor,

Great letter!

Before I get into detail: it’s cool that you want to correspond with me. When I was invited, romantic images invaded my mind. I was immediately reminded of the written exchanges people like Darwin and other thinkers used to have. People who needed to hire a pricey portrait painter if they wanted to take a selfie. All of a sudden I realised that our museums are in fact full of selfies.

This week, at a conference on privacy friendly smart devices, I heard a nice anecdote about Socrates. He was of the opinion that you could only think in couples, that thinking was always an exchange. To him, reading a book was not thinking. The dear chap therefore never put pen to paper, because he thought of writing as a new technology that could not be trusted. For if we were no longer required to remember everything, our brains would turn lazy.

Distrust in new technology is of all ages. As is unbridled optimism, of course. The challenge is to make the two meet somewhere in the moderate middle. The polder model or third way of technology policy: maybe that’s what the two of us will end up with here.

I like how you kick off your letter. Funny: because we have privacy, breaking it can create a bond. I also love cycling very much and I really like reading as well (sorry Socrates). I’m part of a walking club and I cannot possibly resist soggy cakes.

I call myself technology critic and privacy designer – some kind of digital mythbuster actually. I am one of the founders of the SETUP media lab, a cultural organisation that employs humour to provide insight into data issues to a wider audience. My question in life is: how can I help people to look at matters of technology in a thoughtful way?

Because I studied the data industry for such a long time I am so very aware of the ways in which a beautiful thing like this correspondence – two people calmly taking the time to think together – will be guzzled up by algorithms at the same time. The more highbrow words I apply faultlessly, the higher some specialised algorithms will rate my IQ. Do I use the word ‘I’ too often? Then my team player score will possibly go down.

I don’t know for sure, and that – as you said yourself – is the thing: algorithms that compare millions of people can see patterns we can’t conceive. Say 10,000 people who installed a ‘free’ diabetes app turn out to also have liked Snoop Dogg and clay modelling on Facebook relatively often. When I then like Snoop Dogg and clay modelling on Facebook, the conclusion will be: risk of diabetes + 3%.

That guess is then sold as knowledge and that’s how these ‘free’ services are in the end paid for by for instance your health insurance company (and thereby ultimately by you). Can you anticipate something like that when you click that ‘I Agree’ button on Facebook? I don’t think the average citizen can oversee the consequences. As far as I’m concerned it’s time to roll out bigger guns.

I imagine how in 2023 I might apply for a job at Philips because I want to create privacy friendly thermostats there. But an algorithmic HR firm stands on guard and advises against me. According to the algorithm another applicant from an area with a better post code showed an emotionally slightly more stable use of language, and his collection of Facebook likes was more in line with the corporate culture. Maybe in 2023 I’ll start a company that deals in transplants of Facebook likes.

To answer your question: in fact I feel my privacy boundary is being overstepped constantly. Does that sound weird? I think I may know just a little bit too much about this market, about this technology, and especially about the all too human desire to contain risks that drives all of this. Do you ever experience similar pressure? Or do you see it in people around you? And how would you answer your own question?

Bye!

Tijmen


My question in life is: how can I help people to look at matters of technology in a thoughtful way?
Tijmen Schep


Sandor Gaastra - Letter 2

Hello Tijmen,

Just my luck: having to communicate about complex concepts like privacy with a soggy cake junkie… But I will cast aside my resistance because you raise an important topic with that ‘invasion of privacy’. Exchanging personal information, preferably about slightly ridiculous titbits, forges a bond. And gossip is a universal human need, essential for the creation of friendship and animosity and maintaining the social order in communities. Something like monkeys grooming, behavioural scientists say.

Back in the 1870s even the people at Bell’s own company believed the phone would never be found in every house (let alone in everyone’s coat pocket): what would people have to tell each other? A lot, we now know. And most of it is social talk: gossip. Same thing with the internet. It started as a communications network for scientists, some user friendly applications were developed and there you are: just about every adolescent produces a constant stream of pics, vlogs and tweets. So that people will talk about them or they can gossip themselves. And because gossip feels awkward in writing, we invented the emoji that everyone sprinkles over their electronic communication nowadays.

All this online self exposure is fun, friendly and hardly ever earth-shattering. But mainly it’s also an ineradicable human desire. And that’s why it can turn perilous or painful: cyber bullying, private photos and videos that go worldwide, sexting, extortion, you name it. These are tough problems. We as a government have learned two hard lessons: we cannot force a change of behaviour. People will still be sloppy with passwords, their posts and whom they share their data with.

And malevolent digital actors (from adolescent bullies to genuine criminals) are notoriously hard to track down and prosecute, if only because they can operate anonymously or from far, far away. That’s why we not only try to tackle the perpetrators, but also want to empower potential victims. We do that with campaigns aimed at awareness and education. Commercials on tv, but also school materials, websites, ad campaigns, etcetera. It helps, but with this approach we only cover part of the problem – or rather: the assembly of problems we lump together as ‘digital privacy’. It also stretches our stamina as a civil service considerably.

I think one of the reasons why the problem is so tough, is people not really getting the ‘mechanics’ and ‘dynamics’ of all these digital novelties. The speed with which you give away things without actually losing anything yourself. The ease with which you can copy things free of any charge and then share them with half the world, not knowing who’s out there. Can a better understanding of these mechanisms possibly make people more sensible in their dealings with data and digital services? You sparked my curiosity by telling me about SETUP and the employment of humour to provide insight into data issues to a wider audience. Could that be of any help?

Much of our energy as a government now goes into introducing the European Commission’s General Data Protection Regulation, which will come into force on 25 May 2018. With that GDPR the present EU framework, dating from the previous century, is modernised and made more suitable for the digital domain.

Companies that process data (just about any company really) must have a transparent privacy statement, make clear what they use data for, and may not demand more data than needed for their services. Companies are held accountable for their handling of data (and therefore will have to keep data on that) and everyone must be able to easily transfer their own data to another service provider (data portability).

The government (the Personal Data Authority in particular) is given a more prominent supervisory task. All in all it’s a major operation, as we say in The Hague. Does it solve all the problems? Probably not, but we have laid down a firm guideline for the use of personal data, including selling them on. The underlying problem of course is that many people give permission for selling data by accepting terms of use without giving them any real attention.

In short: making laws is important, but more is needed. Awareness in citizens, ethical acting by companies, market incentives for privacy friendly solutions. We’re in a transitional phase. Companies should see the importance of acting privacy friendly as a unique selling point. So, a matter of stick and carrot.

On your job application in 2023: an American company already assesses requests for loans with artificial intelligence and machine learning. They process 70,000 markers per request (yes, you read that right), including the date on which you opened your most recent bank account and your use of language in Facebook updates. In short: it’s there and it’s alive. This issue has far wider implications than privacy and poses new questions for governments.

In the future algorithms will decide on your suitability for a job or your eligibility for a mortgage, but also on the treatment of your disease. These decisions are better (in theory, that is), but it feels awkward, and uncertainty and suspicion lurk. In practice, though, unwanted things won’t happen easily. For example, in the Netherlands data may not be used to exclude people from insurance, and this is not a matter for discussion either.

As to having my privacy boundary crossed, I had to think for a bit. I felt like writing: I live such an obedient life and I do so little on social media that it barely bothers me. But you mean something different, something that sounds like an invasion of your autonomy and your right to self-determination. Or am I wrong?

In ‘Privacy for homo digitalis’ the lawyers Moerel and Prins say that privacy is ‘an antechamber for other fundamental rights and freedoms of the individual, which together are in turn instrumental for the correct functioning of our democratic constitutional state.’ Do you mean something like that? Anyway, I will get back to that.

It does surprise me though that you say your privacy boundary is overstepped all the time. I read everywhere that the young are more conscious of privacy but also have a more relaxed attitude towards it. Are you an exception or are the reports wrong?

I’d love to learn that from you soon!

Sandor


In the future algorithms will decide on your suitability for a job or your eligibility for a mortgage, but also on the treatment of your disease.
Sandor Gaastra


Tijmen Schep - Letter 2

Hey Sandor,

I totally agree with you: we are linguistic, social creatures and we comprehend the world by telling tales about it. What better than a juicy story? Whenever adventures in love were once again shamelessly blown out of proportion in my circle of friends, someone would always shout ‘best story counts!’.

I think the young are in a difficult position. They are at the forefront of technology use, but at the same time have little else than ‘streetsmarts’ (as Danah Boyd explains) to cope with it. They don’t fathom the real issues surrounding data collection and the loss of liberties. But they all feel the pressure to belong.

I think we do indeed need new words to grasp the dark side of all this chatter. It was mainly under this rising social pressure to belong that terms like ‘normcore’ and ‘basic bitch’ were invented in school yards. Both describe the trend to dress as normally and inconspicuously as possible.

In one of his last public lectures philosopher Zygmunt Bauman said: ‘Fear of exclusion is the dominant fear of our time. We are not rebelling against the overbearing state, we are rebelling today against being ignored, against being neglected, against being unseen’. Philosophers like Bauman, but also Foucault and Deleuze, described how fear of not belonging, of not being normal, is one of the most powerful human incentives.

Its immense power was beautifully exposed by some VPRO film makers, who wanted to stream their entire lives online for three weeks. The experiment was prematurely aborted: the psychological pressure of ‘having to be your best self all the time’ became too much.

At the same time China attempts to put that force to work deliberately. You probably heard of the ‘social credit’ system, which will give all Chinese citizens an ‘obedience score’ from 2020 on. If your score is low because you think or consume in too much of a Western way, you can soon wave goodbye to a government job, a loan or a visa.

I see the same effects emerging in the West, but in our case they are a side effect of market forces. We don’t see through that, blinded as we are by the ‘fairytales’ from Silicon Valley. I foresee that it could have an equally powerful chilling effect on society here. I invented a term for it: social cooling – the data version of global warming.

You might recognise this: you’re on Facebook and you run into a link, but you’re in two minds about clicking it, because you think: ‘That visit will likely be recorded and it might not reflect well on me later’. When I speak at conferences some two thirds of the audience already recognise this example. Research points in the direction of emerging self-censorship.

Wikipedia entries on terrorism for example were visited less often after Edward Snowden’s revelations. People feared their visits would be recorded by the NSA. Last month saw Donald Trump demanding the release of data on people who had protested against his policies. Would you still feel comfortable to demonstrate then?

Living in a reputation economy does not only have implications for our democratic processes, it also seriously impacts our ability to innovate. It not only fuels self-censorship but also a culture of risk avoidance. An example: when surgeons in New York were given scores for their work, doctors who took the risk of performing complicated operations were rated lower – because more people died under their knives.

Doctors who didn’t do a thing had high scores, even though none of their patients lived any longer. The surgeons who dared to take risks felt the pressure to perform in an ‘average’ way. It’s not hard to guess the effect of such systems on entrepreneurship and innovation. That is the paradox to me: in the long term the creative industry reduces our creative powers.

That brings me back to your analysis that I consider privacy and autonomy to be the same thing. That’s right. To me the two are fundamentally connected. In a social world privacy is the right to shirk social pressure and conformism, to shape your own ideas and to escape anything mainstream or populist.

In a world where the pressure to be perfect is mounting I call privacy the right to be imperfect. Which in effect makes it the right to be human. Privacy provides the space to think different, an essential condition if we truly want to innovate and not merely copy Silicon Valley’s ‘the more data, the better’ model.

New terms like social cooling will hopefully contribute to spreading this insight. But new laws are also immensely helpful. In that aspect the GDPR is a relief, because it opens the door to finding ethical business models.

Finally, I think we need good examples to make these problems palpable and insightful. SETUP does indeed create humorous examples for a wide audience. During last year’s Dutch Design Week it presented a coffee maker that would serve good or bad coffee, depending on your post code. The lower the ‘status score’ of your neighbourhood, the more watery the coffee. It made the ever increasing influence of data on our lives concrete.

All in all I think I am not an exception, but just someone who is slightly ahead of the pack, because I get the workings and the influence of the data industry a bit better than the average Dutch person. Thankfully critical questioning is on the rise in the general population. Take for example the call for a referendum on the so-called dragnet law.

I think we will be able to recognise the dark sides of data earlier, and I hope my work will contribute to preventing rushed, crisis-driven policies made when there is no choice left. That’s why I want to end with this question: what do you think is needed to distinguish the baby from the bath water at an earlier stage?

Waves

Soggy cake junkie

PS: I added that American company with its 70,000 markers to creepycompanies.com, a website I launched last weekend to – once again – provide examples of the dark side of data to a wider audience. Thanks for the tip 🙂


Living in a reputation economy does not only have implications for our democratic processes, it also seriously impacts our ability to innovate.
Tijmen Schep


Sandor Gaastra - Letter 3

Hello Tijmen,

Thanks for your letter. Honestly, I think you truly brilliantly explain why privacy matters. Not just because of the trouble it may get you into, but also because it is necessary to be yourself, to prevent you from being defined only by what the outside world thinks of you, preconditioned or not. It made me think of Dave Eggers’ ‘The Circle’. To me the most ominous aspect of that book is big brother sneaking in, veiled in good intentions, warm feelings and noble ideas. But the result is an oppressive world, in which conformism suppresses authenticity.

I found myself longing for the cluttered interior – reindeer-antler lamp and all – of protagonist Mae Holland’s parents, even though I really do prefer clean and austere design. Eggers pictures a world in which social cooling is on the verge of turning epidemic, I see now. I will take it as a sign of my sound mental health that I reacted so strongly to it. Maybe we should discuss this further over a coffee (after I entered the post code for the Noordeinde, of course ;-))

Your statement on the creative industry also made me think: Economic Affairs is the ministry that stimulates economic growth and innovation, but at the same time protects consumers and their privacy (and many more things, from agriculture to food safety). Over the last few years we made a government wide effort to make the Netherlands and the Dutch people more innovative and enterprising.

And we appear to be successful, given the flourishing creative industry, the expanding startup culture and the innovation in fields like financial technology and agriculture (we rule the world in precision farming, my colleague in ‘agro’ told me). And now you tell me digitalisation and data driven innovation achieve the exact opposite, and put a brake on the ‘true capacity to innovate’! That would be a bad thing, not just for the economy but also for people themselves. On top of that I think it would lead to resistance in society.

By chance I recently read a newspaper article on normcore. Two almost identically dressed girls were stunned when they were told they dress in such a conformist way. They both sported black and white trainers, but one girl’s shoes had round toes and that was something the other girl would never wear. Young people conform, but at the same time still see themselves as utterly unique and authentic.

Food for thought: while China uses social credits to create obedient citizens, we employ similar mechanisms (including self-regulation) to entice providers that let illegal content like child porn and hate speech pass too often to do better. It’s much more efficient than more laws and regulations, which require a giant effort for enforcement, investigation and prosecution. It only works if the providers cooperate. Fortunately they do.

And you keep the people alert with creepycompanies.com. Good! By the way, some people feel that such an AI and big data driven credit checker is in fact a harbinger of a glorious future, in which loans are cheaper (because checks cost less and the market expands) and therefore more readily available for everyone. I can see their point, but it remains creepy that third parties collect and combine data which could be used in my favour or against me in such a way.

The brand-new coalition agreement provides a good indication of the direction the government is heading with its privacy policy. Citizens for example must remain able to communicate with the government by normal mail. The government safeguards the confidentiality of the data it has on citizens: data in general registers and other privacy sensitive information is always encrypted, and DigiD is made safer.

Citizens get more control over their own personal data. They can point out socially relevant offices and organisations which automatically receive a limited amount of personal data if required. In short, the government is looking for the balance between privacy and ease of use when it comes to the data their citizens entrusted them with.

Another consideration in the coalition agreement is even more sensitive. It’s less about the digital domain, but more about the fight against terrorism in general. In case repressive actions need to be taken ‘a critical assessment must be made every time of how far privacy and other freedoms are curtailed’, the agreement states.

That will be the balancing act in the years to come, I think. Lots of cooperation with the corporate world – not just the ‘big boys and girls’ but also innovative start-ups with brilliant ideas on privacy-proofing my phone, my smart thermostat or that discriminating coffee maker. By doing so we will be taught that lesson about the baby and the bath water. And we will certainly employ tougher means sometimes: laying down responsibilities, law making, enforcement and biting sanctions. But I mentioned that before.

I feel like writing another letter’s worth about the new European e-privacy regulation, which together with the GDPR (Algemene Verordening Gegevensbescherming or AVG in Dutch) you so lauded will form the policy frame for privacy and electronic communication. I won’t though. Beware of overloading your fellow humans with policy frames, because they’re even harder to digest than soggy cakes. And we’ll probably speak in person soon in Eindhoven, in de Effenaar. Won’t we?

Looking forward to it! Regards,

Sandor

PS: Speaking of ethical business models and creative start-ups: do you see opportunities to take privacy in the Dutch digital domain to the next level, using technological innovation? Or is that ‘technology fix thinking’?


Tijmen Schep - Letter 3

Hey Sandor,

Thanks man! I find it equally fascinating to learn more about how government works. As Bauman wrote about my generation: I too have a lot of faith in government. I haven’t the slightest doubt about your good intentions. So bring on those policy frames!

I think we’ve come full circle by looking at that ‘consent’ question. Our society is based on the concept that we are capable of overseeing the situation, and then making a level-headed consideration. What makes government so special in theory is that it allows its people to think about long-term questions and to involve as many stakeholders as possible. But to be honest, I see many signals that governments are not entirely capable of doing so when it comes to technology.

A writer who has fascinated me lately is C.P. Snow. He published a famous essay on the ‘two cultures’. We cannot properly oversee most problems in society, he says, because the exact sciences and the humanities have grown so far apart. That is where my question about the baby and the bath water came from: when I was a student it struck me that the humanities had already understood the problems with technology, but their insights didn’t reach far. The ‘TEDx McOptimism’ goes down like cake, while the humanities’ view – it’s complicated and there are no easy solutions – is more like eating carrots and hummus. Nice as well, but not as tasty as soggy cake.

Policy makers often think they need to learn more about the workings of technology to be able to grasp it. That is an option, but there is an alternative route. We can also strive to better understand human desires and dreams, and to see how new technologies always respond to these desires. The smart part of this route: technology changes fast, but these desires have been steady as a rock for ages. Professor Rein de Wilde for example described the vision of ‘the land of milk and honey’ (see the internet of things), and Imar de Vries spoke of the dream of ‘angelic communication’: the desire for perfect mutual understanding and thus the prevention of misunderstandings.

Ethnographer Grant McCracken describes how we always safeguard our hope for a better world somewhere, far away from the messy reality of the here and now. We do that mainly in the future – things will all get better – and he calls that ‘the horizon of expectation’. Long story very short: in the past God (the Middle Ages), politics (the Renaissance) and (until 2008) ‘the invisible hand of the economy’ offered us a place where we dared to park our hope. Nowadays it appears to be predominantly technology that offers us this haven.

The difficult thing in my work is that I touch on something very profound in people: their hopes for a better world. It’s hard to criticise technology because at heart people don’t want the pedestal to be rocked – we all want to keep our faith in technology. For instance, I see how technology is kept out of the messy human world by presenting it as ‘neutral’ (algorithms) and as some sort of inevitable force of nature, which ‘impacts society’ from the outside.

I don’t mean to say that good intentions will always fail or that hope is irrational, far from it. My point is that good intentions must go hand in hand with down-to-earth long-term thinking. We have to go from ‘best story counts’ to ‘most holistic story counts’. Maybe we can call it ‘sustainable optimism’. The good thing is, it’s not just healthier, but also more powerful. I can’t predict the future, but thanks to my baggage from the humanities I can tell you which predictions of the future are mostly thinly veiled expressions of an ideology.

Take the blockchain for example. It’s one of these technologies that make all my alarm bells ring. You can hear the engineers think: ‘Oh shit, the internet turned out to be a surveillance machine. But version 2.0, the blockchain, will set things straight. It will be incorruptible’. But because there is still too little critical awareness they create a technology with even more authoritarian potential (this is explained in depth on techologiebeleid.nl, a site I launched to provide policy makers with access to the best insights from the humanities).

And so I get to your question. Could we create a market for products that respect our human dignity? We sure can. But to do so we will first have to bridge the gap between the ‘two cultures’, and to involve researchers from the humanities: ethicists, ethnographers, sociologists. Only then will the startup scene finally start thinking about mankind in an adult manner, and hopefully stop spreading simplistic stories, which are in fact mainly made up to part investors from their money, and sometimes take on an almost religious character (singularity).

I can already see the first seeds sprouting, and also big boys like Apple – always well ahead in gauging our desires – regard privacy as a feature now. I very much hope the ministry of economic affairs will stimulate this market. I certainly see opportunities and I think much can be learned from the way organic food was turned into a flourishing trade (your colleague in ‘agro’ knows more about this).

I have no doubt this market will materialise. Privacy (read: autonomy) is such a fundamental human desire. In the next ten years people will start to see how strongly data influence their chances. That data based credit checker will also turn down people because of their data, but that’s a part of the story they’d rather not tell. The Dutch will be Dutch though: only when we begin to feel a negative financial impact from data will we switch to that smart thermostat, city, door bell, messenger or browser where ‘smart’ also stands for ethical.

That leaves me with one question: will the Netherlands be a frontrunner in this?

Let’s discuss it over that coffee in Eindhoven. I hope there will be cake 🙂
Tijmen
