It’s Official: A.I.s Are Now Re-Writing History

AutoAwesome Smile

The other day I created a Google+ album of photos from our holiday in France. Google’s AutoAwesome algorithms applied some nice Instagram-like filters to some of them, and sent me emails to let me have a look at the results. But there was one AutoAwesome that I found peculiar. It was this one, labeled with the word “Smile!” in the corner, surrounded by little sparkle symbols.

It’s a nice picture, a sweet moment with my wife, taken by my father-in-law, in a Normandy bistro. There’s only one problem with it. This moment never happened.

The photo is a not-so-subtle combination of this one:

Original One

and this one:

Original Two

Note the position of my hands, the fellow in the background, and my wife’s smile. Actually, these photos were part of a “burst” of twelve that my iPhone created when my father-in-law accidentally held down the button too long. I only uploaded two photos from this burst to see which one my wife liked better.

So Google’s algorithms took the two similar photos and created a moment in history that never existed, one where my wife and I smiled our best (or what the algorithm determined was our best) at the exact same microsecond, in a restaurant in Normandy.
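(For the curious: a composite like this doesn’t require anything exotic. Below is a minimal, purely illustrative Python sketch of how a “best smile” composite from a burst might be assembled, using OpenCV’s stock face and smile detectors. It is emphatically not Google’s AutoAwesome pipeline; the file names, thresholds, and the naive rectangular paste are all assumptions made for illustration only.)

```python
# Illustrative sketch only: NOT Google's AutoAwesome method.
# Assumes the burst frames are the same size and roughly aligned.
import cv2


def smile_score(gray_face, smile_cascade):
    """Crude smile score: the number of smile detections in a face region."""
    smiles = smile_cascade.detectMultiScale(gray_face, scaleFactor=1.7, minNeighbors=20)
    return len(smiles)


def best_smile_composite(frame_paths):
    # Stock Haar cascades that ship with opencv-python.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smile_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_smile.xml")

    frames = [cv2.imread(p) for p in frame_paths]
    base = frames[0].copy()  # use the first burst frame as the canvas

    # For each face found in the base frame, search every frame of the burst
    # for the "smiliest" version of that region and paste it into the canvas.
    gray_base = cv2.cvtColor(base, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray_base, 1.3, 5):
        best_patch, best_score = base[y:y + h, x:x + w], -1
        for frame in frames:
            patch = frame[y:y + h, x:x + w]
            score = smile_score(cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY), smile_cascade)
            if score > best_score:
                best_patch, best_score = patch, score
        base[y:y + h, x:x + w] = best_patch  # naive paste; no blending or alignment

    return base


# Example (hypothetical file names):
# composite = best_smile_composite(["burst_1.jpg", "burst_2.jpg"])
# cv2.imwrite("smile_composite.jpg", composite)
```

A real system would align the faces and blend the seams rather than pasting rectangles, which is exactly why the result can look like a moment that genuinely happened.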

So what? Good for the algorithm’s designers, some may say. Take burst photos, and the algorithms AutoAwesomely put together what you meant to capture: a perfectly coordinated smiley moment. Some may say that, but honestly, I was a bit creeped out.

Over lunch, I pointed all this out to my friend Cory Doctorow. I told him that algorithms are, without prompting from their human designers or the owners of the photos, creating human moments that never existed.

He was somewhat nonplussed. He reminded me that cameras have always done that. The images they capture aren’t the moments as they were, and never have been. For example, he pointed out that “white balance” is an internal fiction of cameras, as light never appears quite that way when it hits our eyes and minds. He recounted that at one time there were webcams so tuned to particular assumptions that they simply ignored non-Caucasian faces in their algorithmic refinements of images. White balance indeed: ironic racism, in algorithms.

And he reminded me that while I don’t know the designers of AutoAwesome “Smile!”, I don’t know the guys who designed the image adjustment algorithms in my camera either. And those camera builders had nothing more to do with the eventual image adjustments my camera makes than Google’s programmers had to do with inserting my wife’s face on her body at a different point in time.

And it’s not just cameras of course. After all, “this” is not a pipe. Any history recounted in symbols, whether rendered in images, writing, or even spoken words, is not “what happened” or “what existed”. All histories are fictions. And histories that involve machines are machine-biased fictions.

But I do think there is something different, possibly something portentous, going on with AutoAwesome “Smile!”: a difference in quality and kind. And Cory agreed with me that shades of grey do matter, and not in the sense of exposures on silver halide paper.

What is a more fundamental externalised symbol of a subtle human feeling than a smile?

You may say that the A.I.s in the cloud helped me out, gave me a better memory to store and share, a digestion of reality into the memory I wish had been captured.

But I’m reasonably sure you wouldn’t say that if this were a photo of Obama and Putin, smiling it up together, big, simultaneously happy buddies, at a Ukraine summit press conference. Then, I think algorithms automatically creating such symbolic moments would be a concern.

And why am I saying “then”? I’m certain it’s happening right now. And people are assuming that these automatically altered photos are “what happened”.

And I’m sure, at some point in the not too distant future, a jury will be shown a photo that was altered without a single human being involved, without a trace of awareness by the prosecution, defence, judge, accused, or victim. And they’ll all get an impression from that moment that never happened, possibly of a husband’s lack of adequate concern soon after his wife’s mysterious disappearance. It’ll be “Gone Girl” with SkyNet knobs on.

And “look who’s smiling now,” the A.I.s will say.


About the author: Robert Elliott Smith is a faculty member in Computer Science at University College London and co-founder of The Centre for The Study of Decision-Making Uncertainty, Plexus Planning Ltd., and Diphrontis Analytics. You can keep up with him on his website or by following him on Twitter. This article originally appeared here.
