A preliminary guideline for user rights in an unprecedented age of rapid technological advancement and fractionalised authority.
Never interrupt your enemy when he is making a mistake. — Napoleon Bonaparte
So for all our transgressions in the sea of history, amongst the whitewash of indecency and the salty air that now laps, we seem to be doing alright.
However, in any sea there are the surfers and the sunbathers, and they've all been left in the sun for too long. No one's checked up on us while we bob on the waves and simmer on the sand. It's only now, when our skin feels scalded and our numbness has been boiled till soft, that we know we've made a mistake. As the sunbathers bark and writhe in the sand, the surfers watch on, unsure whether it'll be the heat or the wave that sends them sinking.
Maybe instead of our mistakes, it’s just the waves of time that can swallow us.
A full-course meal complete with the grandest trophy fish available. Think of how momentous this API is: it has cleaned mobile and web apps of friction that was considered unavoidable collateral a decade earlier. With a single tap, a user can pair themselves with a personalised service that only moments earlier knew nothing about them. It's that pairing that gives data the pulse we see it having today.
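The essay doesn't name a specific API, but the single-tap pairing it describes reads like a social-login flow in the OAuth 2.0 style. Here is a minimal sketch under that assumption; the provider `auth.example.com`, the endpoints and the parameter names are illustrative, not any particular company's API.

```python
# Hypothetical sketch of the "single tap" pairing: the user approves once,
# the app exchanges a short-lived code for a token, then immediately pulls
# enough profile data to personalise itself. Endpoints are illustrative only.
import requests

AUTH_BASE = "https://auth.example.com"            # assumed identity provider
CLIENT_ID = "my-app"
REDIRECT_URI = "https://myapp.example.com/callback"


def build_consent_url() -> str:
    """URL behind the single tap; one approval grants the app a scoped view of the user."""
    return (f"{AUTH_BASE}/authorize?client_id={CLIENT_ID}"
            f"&redirect_uri={REDIRECT_URI}&scope=profile+email&response_type=code")


def exchange_code_for_profile(code: str, client_secret: str) -> dict:
    """Swap the one-time code for an access token, then fetch the user's profile."""
    token = requests.post(f"{AUTH_BASE}/token", data={
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
        "redirect_uri": REDIRECT_URI,
    }).json()["access_token"]

    # Seconds ago the service knew nothing about this user; now it holds
    # a profile it can start building associations around.
    return requests.get(f"{AUTH_BASE}/me",
                        headers={"Authorization": f"Bearer {token}"}).json()
```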
Associations, relationships and multi-dimensional datasets are just the first constructions of how we use data, but leveraging that data and filtering it through software-based applications is like flooding pure oxygen into a rocket engine. It's a blast.
The Industrial Revolution of the 1800s had its beating heart locked in the theory of the 1600s and 1700s, such as Newton's calculus and Lavoisier's conservation of mass. The legs that slowly carried masses of society out of poverty and suffering were levered by steam engines and mass assemblages of industry. With thermodynamics blazing, the world began folding outward. As the work got dirtier, doctors began washing their hands in the early 1900s, once water was finally being treated with chlorine. Brushing aside the thorns that had kept away billions of eyes before, the world looked deeper, and so it was fitting that it continued mining, with physicists in the desert boring through bedrock that had once been religion's and eventually splitting the cornerstone of our reality: the atom. Questions that appeared indivisible now featured in the periscope of science. History's unending quest to partition particles, which remains the goal of science, continues today.
But the digital age of today rests on the minerals of association. Incomplete datasets, inaccurate recordings and even low-dimensional data are unintelligible. The old ways of separation no longer fit.
In short, the energy of today cannot be partitioned. And that has set our surfers and sunbathers on a collision course. While big tech companies offer us luxury, we're faced with a compelling question about our sense of worth.
It would seem we’re shipwrecked.
A recourse to this dependency on data built upon associations is to invest in new ways machines can learn. Unfortunately, computers today are primed for serial instructions, a rigidity that highlights both the ingenuity of modern computing and the unfortunate suffocation that has come from being absolute in adopting the von Neumann architecture. Consider the process, the number of layers and epochs, and the sheer computing power required for state-of-the-art accuracy (ignoring under- and over-fitting issues) just to identify the variance amongst the Fashion-MNIST dataset.
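To make that cost concrete, here is a minimal sketch, assuming TensorFlow/Keras is installed, of an entirely ordinary classifier for Fashion-MNIST. Even this toy model grinds through tens of thousands of serial weight updates per epoch just to separate ten categories of clothing, and it still won't reach the accuracies we take for granted elsewhere.

```python
# Minimal Fashion-MNIST classifier: an illustration of how much serial
# machinery sits behind even a modest level of accuracy.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 784 inputs per image
    tf.keras.layers.Dense(128, activation="relu"),    # a single hidden layer
    tf.keras.layers.Dropout(0.2),                     # guard against over-fitting
    tf.keras.layers.Dense(10, activation="softmax"),  # ten clothing classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Ten epochs of purely serial weight updates for one small dataset.
model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))
```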
I suspect the next high-level leap in machine learning will come from synthesising a set of neuroscience-based insights, namely:
Decoding the relevant neural processes that guide learning in children. Consider a toddler's mastery of vision and its dissection of social cues, whether directed at its mother or at strangers; the mind of a child is built to learn.
The breadth of consciousness amongst species in different classes, and its neurobiological underpinnings. By understanding how the minds of non-verbal animals work, neuroscientists can begin surgically revealing the confabulating nature of the neural correlates of consciousness. As for my take, I suspect consciousness will be shown to be an illusion, a twisted conjuring of a mind that seeks a rational equilibrium, so that the brain consequently seeks to validate itself in response to the actions it submits. In short, I think the brain operates on the principle of 'shoot first, ask questions later'. What does this suggest? That there is a set of universal, guiding principles that function subconsciously and in effect govern the behaviour of all brains. Interestingly, I think this is not consciously accessible, similar to our autonomic nervous system but engineered for self-validation: a Validation Nervous System.
Memory. I've always been fascinated by the absence of dedicated memory units in the brain. Unlike the solid-state disks or cloud-computing services of today that casually offer near-terabytes of storage, the brain's storage units seem to be neurons (I'm skeptical of being so certain here; I think the surrounding glial cells, which are at least as numerous as neurons and mostly envelop them, have a role to play in providing the necessary framework for memory), which act dually like a capacitor and a transistor, both storing and serving electrical energy. If the brain's mechanism for memory management is revealed, computer scientists will be offered quite possibly nature's greatest achievement: the mechanism for naturally preserving the past.
Privacy rights cannot be framed in the chiaroscuro lighting that has been used before to showcase what citizens can and cannot do. Extrapolating property rights into the virtual domain falls short of properly capturing the perpetual, shifting bilateral partnership users and companies enter when data is exchanged. Whereas land can be purchased for a price and a clear boundary established between both parties, data is a dynamically compounding asset, and one that a firm has to use in building its house of cards: its dataset.
Just as in a divorce with no prenuptial agreement, some couples have their accumulated assets split 50/50, not merely what's in their pockets when they sign the divorce papers. There's an intrinsic value that accrues over time which is profitable to both parties, and it's unfair to demand a complete disintegration of those gains. You can deactivate your Facebook account at any time, but that doesn't require you to break off the friendships you've made.
Giving users rights that may seem 'righteous', such as being able to eradicate all traces of previously stored data, can leave firms vulnerable and crippled once that data becomes unavailable.
There is some middle ground in traditional contractual understandings, such as identity protection, that could be used to facilitate legal frameworks for these data transactions. However, these pre-existing legislative frameworks fail to acknowledge the asymmetry: the capital opportunity the data creates is available only to the firm that holds it, never to the user who generated it.
Herein lies the key pain point: users can't profit from their data while firms can. With this in mind, I think the guiding principle for online user data rights should be protecting the integrity of communication between both parties.
To implement this, a few ideas I currently have on online user rights are the following:
At the lowest level, data needs to be properly contained and anonymously labelled. Data should be treated as an almost radioactive commodity, given how explosive it can be for both parties. As such, an independent authority should be constructed to oversee and certify that firms are storing data securely and that it is non-identifiable, its purpose similar to that of Occupational Health and Safety standards. Importantly, transgressions should be punished in proportion to the number of users affected. Again, data is an asset whose value compounds and inflates further as more associations can be constructed, so traditional punishments generally fail to bring proportionate justice to those responsible. Many regulators make the mistake of making pervasive crimes a cheap endeavour: Facebook was fined $122 million (USD) for misleading EU regulators on how it would manage WhatsApp's data, despite paying $19 billion for the app (a fine of just 0.64% of the sale price). More recently Facebook was fined a trivial £500,000 by the UK's Information Commissioner for the mismanagement of data in the Cambridge Analytica affair. As an exemplar, Google's $6.8 billion fine for anti-competitive tactics shows how mis-stepping companies (with market values nearing $1 trillion) such as these should be handled.
Data is handled by programmers and data scientists; they're the foot-soldiers who work in hand-to-hand combat. Infantrymen are trained in first aid with an emphasis on blood loss and pain management, and it was because of improved training during the Second Iraq War that 99% of patients who reached a hospital within 60 minutes survived. These same soldiers were also trained in Hearts and Minds tactics. Irrespective of your opinion on military deployments to the Middle East, military training now encompasses a broad range of skills because of the complexity of today's battlefields. Programmers are likewise not just 'foot-soldiers', and so we need to redefine what it means to be a programmer. The Russian-led Facebook and Google ads during the 2016 Presidential Election, the sabotage of Iran's and North Korea's nuclear and space programmes, the power outage in Ukraine, the firewalls across China and Egypt, along with the host of other cyber-crimes committed every few seconds, are all carried out at the fingertips of programmers. An ambitious and competent programmer could be one of the most dangerous weapons on this planet. With that in mind, programmers need to understand the severity and consequences of their actions. Conscious programming is a derivative of a larger global shift in mindset, with society seemingly embracing a greater sense of responsibility for caretaking and protective actions that reduce harm to future generations. Given the proliferation and accessibility of machine-learning models today, along with the ease with which advancements are communicated via the Internet, it should be evident that programmers are more than just 'programmers'; in many ways they are guardians. Whether society chooses to have them complete a set of certified courses, classes on legislative rights, or simply extensive background checks is a tactical discussion for later.
Unlike sharing data via an API request, like Spotify connecting to your Facebook account, selling data should be strictly governed, if not outlawed. Once competent and efficient mechanisms are available to trace how data is used (a theoretical and technical nightmare, but an evident necessity, which hopefully means there's an eventual solution), then perhaps regulation on selling data can be loosened. In the United States, users are opted in by default, a criminally unfair position for many users who are not only bewildered by pages of terminology but joining an app in a temporally sensitive scenario. When was the last time you had a few hours to read a privacy policy before downloading an app you needed? Think Uber, Facebook, Instagram, Tinder. Consequently, opt-out programmes, where consent is presumed, should be reversed to opt-in, as they are in Australia.
Until a mechanism is developed that can properly audit a user's data usage amongst third parties and present those details to users transparently, their claims to a share of the profits will collapse into an abyss of non-actionable demands. Expecting firms to put a capital figure on the intrinsic value a user's dataset offers another firm would likely expose the first firm to a slew of class actions, with unending claims of discrimination, racism, sexism and so on. However, the need for such tracking measures becomes irrelevant if selling data is outlawed. A rough sketch of what such an audit trail might record follows below.
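To make the auditing idea concrete, here is a minimal, purely hypothetical sketch: an append-only log of third-party data access keyed to an anonymised user ID, with sharing blocked until the user explicitly opts in (the default argued for above). The record fields, the hashing scheme and the class names are illustrative assumptions, not a description of any existing system or regulation.

```python
# Hypothetical audit trail for third-party data access. Illustrative only.
import hashlib
import json
import time
from dataclasses import dataclass, field, asdict


def anonymise(user_id: str, salt: str = "per-deployment-secret") -> str:
    """Replace the raw identifier with a salted hash so the log itself is non-identifiable."""
    return hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()[:16]


@dataclass
class AccessRecord:
    subject: str           # anonymised user ID
    third_party: str       # who received the data
    fields_shared: list    # which attributes were handed over
    purpose: str           # declared reason for the transfer
    timestamp: float = field(default_factory=time.time)


class AuditTrail:
    """Append-only log; sharing is refused until the user has opted in."""

    def __init__(self):
        self._consent: dict = {}
        self._log: list = []

    def opt_in(self, user_id: str) -> None:
        self._consent[anonymise(user_id)] = True

    def record_access(self, user_id: str, third_party: str,
                      fields_shared: list, purpose: str) -> AccessRecord:
        subject = anonymise(user_id)
        if not self._consent.get(subject, False):   # opt-out by default
            raise PermissionError("user has not opted in to data sharing")
        rec = AccessRecord(subject, third_party, fields_shared, purpose)
        self._log.append(rec)
        return rec

    def report_for(self, user_id: str) -> str:
        """What the user would see: every third party that touched their data."""
        subject = anonymise(user_id)
        return json.dumps([asdict(r) for r in self._log if r.subject == subject], indent=2)
```

Whether such a trail lives inside each firm, with a regulator holding audit rights, or on some shared infrastructure is exactly the unresolved "theoretical and technical nightmare" mentioned above.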
Hopefully you enjoyed reading. Let me know if you have any suggestions!