The heuristics Google (or Facebook, or whatever) uses to predict our behaviour shape us into less interesting, more malleable, more isolated people. I don’t like that.
The big social networks watch what we do and then offer us interesting new stimuli as we browse, but the rewards emerge from a modelling process that depends on stereotyping consumer behaviour. This has already been observed (in *The Filter Bubble*, for example), but the critique there doesn’t go nearly far enough. The practice is creeping right into the operating systems: Apple now distributes core libraries for managing consumer behaviour that sit in the bowels of your iPhone.
Not only are we stereotyped from among a set of possible consumer types; the assumption is also that we are, should be, and want to be nothing more than individual consumers. There is a whole series of assumptions that I challenge in my writing but voluntarily surrender when I use Facebook, Google, or other similar networks. In that world, individuals make up society in a freakish libertarian way. The only valid persons are humans; there are no animals, no Bodhisattvas or local deities or trees. Yet in my ordinary social behaviour, when I walk out the door of this house, there are animals, plants, and shrine deities with whom I am expected to interact socially along with all the human people. Successful social gestures are deeply collusive and require constant subconscious attending to other social actors as we all choose how to stand, walk, speak or guffaw. Goffman was right.
In the Google world, genuinely collaborative behaviour is *not* rewarded, individual consumer behaviour in quantifiable clumps is. It’s an ongoing, adaptive, personalized behaviour modification scheme that presumes certain cultural norms. The maze adapts to build a better maze-rat, and the effect is that my online social face behaves in ways I would regard as brutal, insensitive and anthropocentric offline. It’s like a video game within which it’s acceptable to kill—but in fact many of my real-world economic decisions are made through that video game.
That in turn suggests that the more we use sophisticated, market-driven, social software, the less chance there is that we will actually be any use to our fellow sentient beings.
But it’s good to talk, right?
So here’s my real sorrow. In many years of working with very bright open source software engineers, only a handful of them have ever grasped the collective and ecological nature of society. Many have been ardent fans of Ayn Rand, in a depressingly juvenile way. The vicious ideology of The Cathedral and the Bazaar pretty much sums up the problem. The usual hacker’s answer to oppression is to buy wholly into a near-solipsism in the name of ‘autonomy’.
Yet right now we need a different sort of hacker, a post-Frankfurt School radical green hacker.
We need such people to find a way to subvert Google’s own analytic engines, so that these tools don’t just allow us to communicate (at the expense of being numbed into pointless consumer behaviour) but actively help us collaborate against repressive economic models. Could we write our own FOSS distributed analytic engines that _didn’t_ presume consumer-agents as the basic mode of personhood, and that inserted their results into web searches and sidebar adverts? Could we write better versions of common libraries that redirect Apple adverts to collective music-making sites? Can we ripple the unctuous surface of online marketing?