The endless cookie-settings pop-ups on every website can feel like a prank played on an internet that refuses to change. They are, to be sure, very annoying. That annoyance is often blamed on regulators taking their revenge on the data markets, giving the General Data Protection Regulation (GDPR) a bad reputation and making it look as if political bureaucrats have once again clumsily meddled with the otherwise steady march of innovation.
The truth, however, is that the vision of privacy put forward by the GDPR could spur an era of innovation far more exciting than the tainted technology of today. As things stand, though, that simply isn't happening. What is needed is an infrastructural approach with the right incentives. Let me explain.
Detailed metadata is collected behind the scenes
As many of us are by now well aware, laptops, phones and every device prefixed with "smart" produce a constant stream of data and metadata. So much so that the concept of a sovereign decision over your personal data hardly makes sense: if you click "no" to cookies on one website, an email will still leave a trail. Delete Facebook, and your mother will still tag your face with your full name in an old birthday photo, and so on.
What is different today (and why a CCTV camera is actually such a misleading image of surveillance) is that even if you choose, and have the skills and know-how, to protect your own privacy, the shared environment of mass metadata collection will still harm you. It is not about your data, which will often be encrypted anyway, but about how massive streams of metadata still reveal things at a fine-grained level and make you visible as a target: a prospect for marketers, or a potential suspect should your patterns of behavior differ from the norm.
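To make that concrete, here is a minimal sketch in Python using entirely made-up traffic records; the domain names, timestamps and sizes are hypothetical, invented for illustration. It shows how much a bare metadata log reveals even when every payload is encrypted:

```python
# A minimal sketch (hypothetical data) of why metadata matters even when
# payloads are encrypted: destinations, timestamps and sizes alone are
# enough to profile someone's habits.
from collections import Counter
from datetime import datetime

# Each record is (timestamp, destination, bytes). Note: no message content.
traffic_log = [
    ("2021-06-01 07:02", "chat.example.com", 1200),
    ("2021-06-01 07:03", "chat.example.com", 900),
    ("2021-06-01 12:45", "clinic-portal.example.org", 15000),
    ("2021-06-02 07:01", "chat.example.com", 1100),
    ("2021-06-02 23:10", "betting-site.example.net", 4800),
]

destinations = Counter(dest for _, dest, _ in traffic_log)
hours = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
                for ts, _, _ in traffic_log)

# Without reading a single message, the log already suggests a morning
# chat routine, a visit to a medical portal and late-night gambling.
print("Most contacted services:", destinations.most_common(3))
print("Most active hours:", hours.most_common(3))
```

Nothing in this sketch decrypts anything; the pattern of who, when and how much is already enough to flag a person as a prospect or a suspect.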
Related: Privacy concerns are on the rise and blockchain is the answer
Regardless of what it might look like, everyone actually wants privacy, including governments and corporations, and especially militaries and national security agencies. But they want privacy for themselves, not for others. And that lands them in a conundrum: how can national security agencies, on the one hand, keep foreign agencies from spying on their populations while, on the other, building backdoors into those same systems so they can snoop themselves?
Governments and corporations have no incentive to provide privacy
To put it in language all too familiar to these readers: there is demand, but there is, to put it mildly, an incentive problem. As an indication of just how large that incentive problem currently is, an EY report values the market for United Kingdom health data alone at $11 billion.
Such reports, although highly speculative about the actual value of data, produce an irresistible fear of missing out, or FOMO, which becomes a self-fulfilling prophecy as everyone scrambles for the promised profits. This means that while everyone, from individuals to governments to big technology companies, might want to ensure privacy, they simply do not have strong enough incentives to do so. The FOMO, and the temptation to sneak in a backdoor and make secure systems just a little less secure, are simply too strong. Governments want to know what their populations (and others) are talking about, businesses want to know what their customers are thinking, employers want to know what their employees are doing, and parents and teachers want to know what their children are up to.
There is a useful concept from the early history of science and technology studies that can help shed some light on this mess: affordance theory. The theory analyzes the use of an object in terms of its environment, its system and the things it offers to people, that is, the kinds of actions an object or system makes possible, desirable, comfortable and interesting. Our current environment, to put it mildly, offers the irresistible temptation of surveillance to everybody from pet owners and parents to governments.
Related: The data economy is a dystopian nightmare
In an excellent book, the software engineer Ellen Ullman describes programming some network software for an office. She vividly recounts the horror of the boss who, once the system is installed, excitedly realizes that it can also be used to track the keystrokes of his secretary, a person who has worked for him for more than a decade. Where there had been trust and a good working relationship, the new affordances of the software inadvertently turned the boss into a voyeur, poring over the minute daily work rhythms of the people around him: the frequency of their clicks and the pauses between their keystrokes. Such mindless monitoring, albeit by algorithms rather than humans, is what passes for innovation today.
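To illustrate how little machinery such surveillance actually requires, here is a minimal sketch in Python with made-up timestamps; everything in it, including the two-second "idle" threshold, is a hypothetical illustration rather than anything from Ullman's account:

```python
# A minimal sketch (invented numbers) of how trivially keystroke
# surveillance can be built once events are logged: a few lines turn
# raw timestamps into a work-rhythm profile.
from statistics import mean

# Hypothetical keystroke timestamps, in seconds since the session began.
keystroke_times = [0.00, 0.18, 0.31, 0.52, 4.90, 5.05, 5.21, 12.40, 12.58]

# Pauses between consecutive keystrokes.
pauses = [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]
long_pauses = [p for p in pauses if p > 2.0]  # arbitrary "idle" threshold

print(f"Mean pause between keystrokes: {mean(pauses):.2f}s")
print(f"Idle breaks longer than 2s: {len(long_pauses)}")
```

The point is not the specific numbers but how low the barrier is: once the events are being logged, turning a colleague's workday into a profile takes a few lines of code.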