by Mark Pesce
March 26, 2018
from Meanjin Website

For those wholly preoccupied with the self-immolation of Australia's Test cricket team: this weekend saw an acceleration of the revelations about Facebook, Cambridge Analytica, and the continued erosion of our privacy.

On the 21st of March, an enterprising Kiwi named Dylan McKay downloaded his Facebook profile data - to see just what Zuckerberg had collected on him.

'Somehow it has my entire call history with my partner's mum,' he tweeted, in amazement.

Facebook, it seems, had been recording his call and messaging metadata - even though he wasn't using the Facebook app to make calls or send text messages. (George Brandis would be proud!)

That this could happen at all is entirely Google's fault.

Their Android operating system - powering around 90% of the smartphones ever manufactured - has so many security holes and workarounds that it's a data gatherer's paradise. That's not really surprising, given Google's business model: monetizing all of the data we freely shed. It's never been in Google's economic interest to lock down Android.

Throttling its data capture would starve the beast.

Apple - which earns its lucre from gadgets, not from the data they generate - keeps iOS more securely 'sandboxed'. Apps can't peek into call records, contact lists, or much of anything else, except in very specific circumstances and for very specific uses, which Apple inspects carefully when developers submit their code to the App Store for review, prior to approval and publication.

Facebook was merely making the most of the surveillance capitalism business model pioneered by Google and written into Android itself.

That's not to excuse Facebook. At some point its management took decisions to capture and store this data, then allocated the engineering resources to make it all happen.

Somehow, Facebook management thought all of this was not just OK, but a welcome extension of Facebook's stated mission of 'helping the world to share'. Now it seems Facebook has been helping itself to the world's sharing.

We've also become aware of 'dark profiles': people who don't have Facebook accounts, but whose presence in others' contact lists (and in this newly revealed communications metadata) fully describes their social networks anyway - simply because everyone around them is on Facebook and connected to them, outlining the missing individual in a very distinct shadow.

Even non-participation does not leave you outside Facebook.

What is to be done?

In The Last Days of Reality (Meanjin, December 2017) I asked that question and pointed - somewhat feebly - to a few approaches that might start to build a beachhead of data autonomy away from the increasing profiling of surveillance capitalism.

It's become clear that this is easier said than done. Even if the EU or some other jurisdiction banned Facebook completely (as China does today), data profiling would continue - at the supermarket checkout, every time you tap your credit card, with every search term typed into a search engine offered up 'freely' by Google.

Our dilemma is bigger than #deleteFacebook or any of the thousand seemingly rational solutions offered up by well-meaning geeks, from federated systems (Mastodon being one example) to wild-eyed promises of blockchain-based everything for social networks, grounding social identity in implacable mathematics.

But we do not have a technology problem.

All of these problems were created by the clever use of technologies to achieve ends well beyond those imagined by the users of those technologies.

There is no technology that cannot be perverted to serve the ends of those who place their own goals above the common good. It's our karma that has landed us here, not our toys.

Our new social experiment - a world where everyone is connected - did not long survive its encounter with the predatory forces of capitalism, descending into a sewer of anger amplification in the name of 'increasing user engagement'.

That phrasing makes it sound as though the subjects of this experiment - all of us - live in a world apart from the effects of that amplification. It's more like setting a building on fire while living inside it.

We have learned an important lesson: we commercialize our social spaces at our peril. Now we confront the immediate challenge of finding our way to the exits in this burning building, as it collapses in ruins.

Once safely outside, we confront a greater challenge:

how do we collectively imagine a future where this never happens again...?