As Mark Zuckerberg testifies to the US Senate and grimly accepts the inevitability of regulation of the social media giant, we have an excellent opportunity to define the bounds of that regulation, to ensure the kinds of malfeasance Facebook has repeatedly been caught out for never happen again.
Everything centres on transparency. Transparency shines a light on internal processes intentionally kept opaque in the name of ‘trade secrets’ and ‘competitive advantage’. Code for ‘maximum extractive capacity’. However, as Facebook COO Sheryl Sandberg effectively admitted last week, its users are the product. Those same users can demand rights—privacy rights and, arguably, moral rights—over how their personal data gets used, by whom, when, where, and for how long.
Yet it’s more than that. All of this starts with Facebook’s pervasive surveillance, which uses cookies to track users even after they leave the platform, then uses ‘shadow profiling’ techniques to build profiles even of people who have never joined Facebook. All of that is invisible to us, and it is exactly where we need to start.
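To make that mechanism concrete, here is a minimal sketch of how third-party ‘pixel’ tracking works in general. This is hypothetical illustration code, not Facebook’s actual tracker: any page that embeds an invisible image served from a tracker’s domain makes the visitor’s browser call the tracker, sending along whatever cookie that domain set earlier. That is all it takes for one company to follow you across the rest of the web.

```python
# Hypothetical tracking-pixel server (illustrative only).
# Any page embedding <img src="https://tracker.example/pixel.gif">
# triggers this handler from the visitor's browser.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/pixel.gif")
def pixel():
    # The Referer header names the page that embedded the pixel.
    visited_page = request.headers.get("Referer", "unknown")
    # A cookie set on an earlier visit ties this page view to every
    # previous one; if none exists, mint a fresh identifier now.
    visitor_id = request.cookies.get("uid") or str(uuid.uuid4())
    print(f"visitor {visitor_id} viewed {visited_page}")  # the profile grows
    # Reply with a transparent 1x1 GIF so the page renders normally.
    gif = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
           b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
           b"\x00\x00\x02\x02D\x01\x00;")
    resp = make_response(gif)
    resp.headers["Content-Type"] = "image/gif"
    resp.set_cookie("uid", visitor_id)
    return resp

if __name__ == "__main__":
    app.run(port=8000)
```

Every page that embeds the pixel quietly adds a line to the visitor’s browsing history on the tracker’s servers, whether or not that visitor has an account with the tracker.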
First Law: Data Collection Must Be Transparent
What is being collected about us? Where is it happening? Why? If we truly live in the midst of a culture of ‘surveillance capitalism’, we should at least have the capacity to bargain for our right to be surveilled, or to be left alone. We can’t do even that, simply because most of this monitoring is invisible—intentionally hidden—and therefore can be neither observed nor blocked.
Every connected device that sends profiling and usage data back to any source must reveal both the mechanism and the destination—all the destinations—of that data collection, in real time, as it happens, not months or years after the fact. The user must always be given an easy way to refuse or modulate the data being collected. That’s the first law.
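Nothing like this exists today, but a minimal sketch can make the requirement concrete: a machine-readable disclosure a device would have to publish before transmitting anything, listing every data point gathered and every destination it reaches. All of the field names below are invented for the illustration.

```python
# Illustrative only: one possible shape for a first-law disclosure.
from dataclasses import dataclass

@dataclass
class CollectionDisclosure:
    collector: str            # who is doing the collecting
    data_points: list[str]    # exactly what is gathered
    destinations: list[str]   # every endpoint the data is sent to
    purpose: str              # the stated reason for collection
    refusable: bool = True    # the law: collection can always be declined

disclosure = CollectionDisclosure(
    collector="example-smart-tv",
    data_points=["viewing history", "device identifier"],
    destinations=["https://telemetry.example.com",
                  "https://ads.example.net"],
    purpose="content recommendations and ad targeting",
)
print(disclosure)
```

A regulator could demand that such a disclosure be complete and current, and a browser or home router could simply block any transmission not covered by one.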
Second Law: Data Analysis and Algorithms Must Be Transparent
Once data has been collected, it’s fed into a range of computer programs for analysis. These algorithms extract meaning from the data, and what they’re looking for largely determines what they see. That’s how Facebook not-so-subtly became a way for racist landlords to keep minorities out of their properties: its ad-targeting tools let housing advertisers exclude users by ‘ethnic affinity’. All analysis is fraught, because what it ignores—or doesn’t even observe—is often more important to the overall story than any data analysed.
The public have no idea how Facebook’s analysis algorithms work. In some cases, not even the creators of these algorithms fully understand them, because they’re the product of machine learning over billions of samples of data. We need to see these algorithms broken down into their constituent parts. We need to open them up and study their entrails, to understand both the basis of their predictive power and the limits of their accuracy. Algorithms fed with data collected from us must be transparent to us. That’s the second law.
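What ‘breaking an algorithm into its constituent parts’ can mean is easiest to see on a toy model. The sketch below trains a simple linear classifier on invented data; its learned coefficients show exactly how hard each input pulls a prediction one way or the other. Facebook’s real models are vastly larger and harder to read, which is precisely why the law is needed.

```python
# Toy illustration of model transparency: a linear classifier's
# coefficients are a complete account of how it weighs each input.
# Feature names and data are invented for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["age", "pages_liked", "ad_clicks"]
X = np.array([[25, 10, 1],
              [40, 200, 9],
              [33, 50, 2],
              [52, 300, 12]])
y = np.array([0, 1, 0, 1])  # e.g. 'was shown the housing ad': no/yes

model = LogisticRegression().fit(X, y)
for name, weight in zip(features, model.coef_[0]):
    # Sign and magnitude reveal each input's pull on the prediction.
    print(f"{name}: {weight:+.3f}")
```

A deep neural network admits no such one-line reading, and that opacity is exactly what the second law would force into the open.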
Third Law: Data Sale Must Be Transparent
Once Facebook has collected user data and analysed it, it considers that analysis its work product—something it owns and can use or sell as it sees fit. This is where the tech giant has really run into trouble, as we learn how those profiles were used by Cambridge Analytica, and by Russian propagandists, and … well, by pretty much anyone willing to pay for the data.
There’s no reason to believe Zuckerberg when he claims Facebook will do better in the future, because there’s no indication Facebook has ever changed its ways despite repeated privacy violations. If Facebook isn’t going to keep a handle on who’s using its data, then it’s up to the users to do it.
Every user needs a ‘dashboard’ that shows, in simple, clear diagrams, the path between data collection by Facebook and its eventual sale to a Facebook customer. Every user needs to be able to refuse any of these uses, and to inspect those data buyers and their algorithms with the same transparency demanded of Facebook itself.
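Purely as an illustration of what such a dashboard would have to surface, here is one possible shape for a per-sale record: where the data came from, which analysis produced the product, who bought it, and a refusal the platform must honour. Every name and field here is hypothetical.

```python
# Hypothetical record behind one entry in a third-law dashboard.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataSaleRecord:
    collected_from: str         # the raw inputs, per the first law
    derived_by: str             # the analysis, per the second law
    sold_to: str                # the buyer
    sold_on: date
    user_refused: bool = False  # a refusal must block the sale

record = DataSaleRecord(
    collected_from="likes, shares, off-site pixel hits",
    derived_by="interest-segmentation model v3",
    sold_to="Acme Political Consulting",
    sold_on=date(2018, 4, 10),
)
record.user_refused = True  # the user withdraws consent for this buyer
print(record)
```

The point is not the particular schema; it is that every hop from collection to analysis to sale becomes a record the user can see and veto.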
When we lose all rights to the most personal data about ourselves, we lose ourselves. To prevent that from happening, data sale must be transparent. That’s the third law.
This is not an exhaustive list; many more protections could be codified into legislation. But these three laws are a starting point—both for regulation, and for a deeper understanding of how profiling becomes profits, and of the limits we want to place on that.
Mark Pesce is a futurist, inventor, author, educator, speaker & broadcaster.
Read his essay on Facebook and other emerging altered realities: The Last Days of Reality