Privacy Day 2019

While reading Jamie Bartlett's The Dark Net, I was struck by a single phrase.

The term ‘the clueless 95%’ was coined in Tim May’s 1994 cypherpunk manifesto, the Cyphernomicon, which set out an anarchic and elitist world view in which ‘those in the know’ would be bound for prosperity while everyone else scrabbled at the edges for crumbs. Twenty-four years later, the same phrase epitomises how the age of electronic communications has created an environment in which the majority are almost completely ignorant of the data collection, manipulation and usage practices of the organisations they interact with, while a minority take advantageous decisions at their expense. In fact, many people don’t even know who these organisations are, let alone what they are up to.

Today is World Privacy Day, when those of us with a professional interest in data privacy and protection will be pontificating ad nauseam about the progress made, the bad actors, what needs to change, and so on. Amid this rhetoric, though, we really need to ask ourselves two fundamental questions:

  • What are we trying to protect; and
  • Who are we doing it for?

Simplistically, the answers are: individuals’ rights and freedoms; everyone.

However, it is anything but simple. Our information society has far more transactional touchpoints than an industrialised one, particularly in what Prof. Randal Picker of the University of Chicago describes as the ‘mediafication’ of everyday life. Individuals would be hard pressed to know, understand and keep track of them all. Often, people don’t know what they are giving away, or for what purpose, so they don’t know what they need to protect. Delve a little deeper and the potential power of collecting, aggregating and manipulating thousands of data points per person, fuelled by internet-enabled devices, becomes apparent. The ability to target advertising at a specific individual, serving hundreds of thousands of iterations of the same message in order to nudge opinion or decision-making, is immensely powerful.

Extend this discussion to AI and machine learning. Just last year, Adam Conner-Simons, communications and media relations officer at MIT’s CSAIL, wrote a blog post in which he commented that “Last year’s study showing the racism of face-recognition algorithms demonstrated a fundamental truth about AI: If you train with biased data, you’ll get biased results,” citing the Institute’s 2018 Gender Shades project. The inference is that complex programmes have the potential to take on the bias (or prejudices) of the data they are trained on and, by extension, of the programmers and engineers who build them. If, as is currently the case in certain sectors of our technology infosphere, AI becomes a proprietary system controlled and monetised by a single company, there has to be someone to question that system and hold the actors to account. Without this, the liberty of every citizen, particularly those at the margins, is under threat.

Data protection and privacy specialists play a key role in this process. Acting as the data subject’s conscience inside the business, their knowledge and input is invaluable, not least in ensuring that senior management teams are given the time to consider large, data-heavy projects when the development, marketing or sales teams are bowling ahead with their enthusiasms. Specifying what you don’t want a tool to do is just as important as specifying its capabilities. Some of the biggest data breaches have been caused by poorly implemented training or security programmes, or by the misguided use of technology. These professionals have a much greater role to play, though. A purely legal, case-by-case approach will see businesses lurching from one development phase to the next, whereas an organisation that embeds compliance at the highest level will see a smooth change in corporate behaviour that embraces privacy and can deliver the trust its customers and clients (and legislators) are looking for.

Which brings us back to our ‘clueless 95%’. Anyone who has ever spent time trying to clear out a grandparent’s email inbox, or teach them how to download an app on their phone (or use WhatsApp!), will understand how even the simplest technology can be incomprehensible. Mobile phones are ubiquitous, but parents forget to warn their children of the dangers of using them to log into a dodgy online game, or to register their numbers with the TPS (Telephone Preference Service), and then have to explain the terminology used in the sex-line voicemail left for their blushing teenager. Vulnerable elderly people are left so bemused and frightened by phishing phone calls, emails and letters that they no longer trust that the police are who they say they are when they phone to offer help.

Technology companies and their lawyers often explain this away with the phrase “they don’t care anyway”. Well, you can’t care for a pet you don’t own, and you can’t care about intrusive data collection if you don’t know it is going on. This is where the privacy community and its many advocates are vitally important.

Our role is to supply the knowledge for those individuals who don’t have it; to encourage organisations to see data protection as people protection; to educate and encourage businesses to build trust by being trustworthy; and to ensure that no one is adversely affected by someone else’s bad actions with their personal information.
