Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Friday, December 26, 2014

The Inevitable Cruelty of Algorithmic Mediation

Also posted at the World Future Society.

On Christmas Eve, Eric Meyer posted a devastating personal account reminding us of the extraordinary cruelty of the lived experience of ever more prevalent algorithmic mediation.

Meyer's Facebook feed had confronted him that day with a chirpy headline that trilled, "Your Year in Review. Eric, here's what your year looked like!" Beneath it was the image that an algorithm had number-crunched to the retrospective forefront, surrounded by clip-art cartoons of dancing figures with silly flailing arms amidst balloons and swirls of confetti in festive pastels. The image was the face of Eric Meyer's six-year-old daughter. It was the image that had graced the memorial announcement he had posted upon her death earlier in the year. Describing the moment when his eye alighted on that adored, unexpected gaze, now giving voice to that brutally banal headline, Meyer writes: "Yes, my year looked like that. True enough. My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully."

Meyer's efforts to come to terms with the impact of this algorithmic unkindness are incomparably more kind than they easily and justifiably might have been. "I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases." To emphasize the force of this point, "Inadvertent Algorithmic Cruelty" is also the title of Meyer's meditation. "To show me Rebecca’s face and say 'Here’s what your year looked like!' is jarring," writes Meyer. "It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate." But just what imaginary scene is being conjured up in this exculpatory rhetoric in which inadvertent cruelty is "coming from code" as opposed to coming from actual persons? Aren't coders actual persons, for example?

Needless to say, Meyer has every right to grieve and to forgive and to make sense of these events in the way that works best for him. And of course I know what he means when he seizes on the idea that none of this was "a deliberate assault." But it occurs to me that it requires only the least imaginable measure of thought on the part of those actually responsible for this code to recognize that the cruelty of Meyer's confrontation with their algorithm was the inevitable, at least occasional, result for no small number of the human beings who use Facebook and whose lives attest to suffering, defeat, humiliation, and loss as well as to parties and promotions and vacations. I am not so sure the word "inadvertent" quite captures the culpability of those humans who wanted and coded and implemented and promoted this algorithmic cruelty.
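To make the inevitability concrete, consider a minimal sketch of the kind of selection heuristic presumably at work. The function and the engagement-based ranking here are my assumptions for illustration, not Facebook's published method; but any ranking by raw engagement will reliably surface a memorial photo, precisely because grief draws comments and condolences at rates that dwarf vacation snapshots.

```python
# Hypothetical sketch of a naive "Year in Review" photo picker.
# Assumption: posts are ranked by raw engagement (likes + comments);
# this is NOT Facebook's documented method, only an illustration of
# how such code "works" for most users and fails cruelly for some.

def pick_year_in_review_photo(posts):
    """Return the user's 'top' photo of the year by engagement."""
    photos = [p for p in posts if p.get("photo")]
    if not photos:
        return None
    # A memorial announcement draws condolences by the hundreds, so
    # this line predictably promotes the year's most painful image.
    return max(photos, key=lambda p: p["likes"] + p["comments"])

posts = [
    {"photo": "beach.jpg",    "likes": 40, "comments": 5},
    {"photo": "memorial.jpg", "likes": 90, "comments": 300},  # grief engages
]
print(pick_year_in_review_photo(posts)["photo"])  # -> memorial.jpg
```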

And I must say I question the premise of the further declaration that this code "works in the overwhelming majority of cases." While the result may have been less unpleasant for other people, what does it mean to send someone an image of a grimly grinning, mildly intoxicated prom date or a child squinting at a llama in a petting zoo, surrounded by cartoon characters insisting on our enjoyment and declaring "here's what your year looked like"? Is that what any year looks like, or lives like? Why are these results not also "jarring"? Why are these results not also "unfortunate"? Is any of this really a matter of code "working" for most everybody?

What if the conspicuousness of Meyer's experience of algorithmic cruelty indicates less an exceptional circumstance than the clarifying exposure of a more general failure, a more ubiquitous cruelty? Meyer ultimately concludes that his experience is the result of design flaws that demand design fixes. Basically, he proposes that users be given the ability to opt out of algorithmic applications that may harm them. Given the extent to which social software forms ever more of the indispensable architecture of the world we navigate, this proposal places an extraordinary burden on those who are harmed by carelessly implemented environments they have come to take for granted, while absolving those who build, maintain, own, and profit from those environments of responsibility for the harms resulting from their carelessness. And in its emphasis on designing around egregious experienced harms, this proposal disregards the costs, risks, and harms that are accepted as inevitable merely because they are habitual, or that vanish in their diffusion over the long term as lost opportunities hidden behind given actualities.
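To see why the critique lands, here is a minimal sketch of what such an opt-out fix amounts to in practice; the setting name and surrounding API are my inventions for illustration, not anything Meyer or Facebook specified. The default remains exposure: the guard only protects a user who anticipated the harm and found the switch before it occurred.

```python
# Hypothetical sketch of the proposed "opt out" design fix.
# The flag name "opt_out_year_in_review" is invented for illustration.

def maybe_show_year_in_review(user_settings, render):
    # The default is exposure: the burden falls on the user to have
    # anticipated the harm and flipped this switch in advance.
    if user_settings.get("opt_out_year_in_review", False):
        return
    render()

maybe_show_year_in_review({}, lambda: print("Your Year in Review!"))  # shown by default
maybe_show_year_in_review({"opt_out_year_in_review": True},
                          lambda: print("Your Year in Review!"))      # suppressed
```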

But what worries me most of all about this sort of "opt out" design fix is that with each passing day algorithmic mediation grows more extensive, more intensive, more constitutive of the world. We all joke about the ridiculous substitutions performed by autocorrect functions, or the laughable recommendations that follow from the odd purchase of a book from Amazon or an outing from Groupon. We should joke, but don't, when people treat a word cloud as an analysis of a speech or an essay. We don't joke so much when a credit score substitutes for the judgment whether a citizen deserves the chance to become a homeowner or start a small business, or when a Big Data profile substitutes for the judgment whether a citizen should become a heat signature for a drone committing extrajudicial murder in all of our names. Meyer's experience of algorithmic cruelty is extraordinary, but that does not mean it cannot also be a window onto an experience of algorithmic cruelty that is ordinary. The question whether we might still "opt out" of the ordinary cruelty of algorithmic mediation is not a design question at all, but an urgent political one.

1 comment:

Tomi Dufva said...

Good post, thank you! I also wrote about this, though from a somewhat different viewpoint: http://www.thispagehassomeissues.com/blog/2014/12/27/tis-the-season-to-be-algorithmically-presented