Mirror, Mirror: The lure of the AI selfie

At the end of 2022, AI avatars generated from selfies uploaded to an AI art generator app spread like wildfire on Twitter. There are a few such apps, but Lensa AI was far and away the most popular to experiment with. Lensa AI takes the uploaded selfies and spits out a series of avatars in different art styles or themes, including kawaii. Other apps offer similar products with slightly different art styles; Dream by WOMBO had a particularly horrifying medieval option, which supplied me with the below interpretation of my face.

Make it medieval

However, this outcome is far less likely than the smooth, pretty, youthful renditions of my face.

Some options made me ethnically ambiguous and co-opted aspects of Indigenous culture to create a ‘theme’. For example, one of the themes in an AI art app gave me Kakiniit (facial tattoos), a traditional cultural practice of the Inuit. Kakiniit are currently experiencing an important cultural revival after missionaries attempted to stamp out the practice following their arrival in Arctic communities over 100 years ago.

Confronted with the Indigenised version of my face, as a white woman living on stolen land I felt wrong-footed and horrified that the AI generator had taken a culturally specific practice and given it to me to ‘try on’. I was newly reminded that the data that generates AI art is (almost) indiscriminately vacuumed up from the web and spat out again in response to ‘prompts’. Undoubtedly the AI saw a photo of me in a large faux-fur-trimmed hood, covered in snow, and used those elements to apply Inuit ‘style’ features to the avatars it generated, with little thought or care for cultural specificity.

AI ‘art’ is part of the white, colonialist techno-culture that created it. Difference is stripped of its cultural and historical context and becomes another thing for us to try on for size, like so many Orientalist gazes before.


While this specific case is troubling, I found the whole exercise jarring. AI removed my reading glasses and made me unsmiling, serious. It showed me a pretty, younger version of myself. I had no pores and, suddenly, a perfectly symmetrical face. Looking at the AI-generated images of myself I felt sad and upset and maybe a little bit like I wanted to cry. It was me, but not me. Me as AI thought I should look, according to prevailing beauty standards.


My face is smaller, cuter. I can recognise myself in the images, but also, not at all. My selfies had been run through a filter that added normative ideas of beauty on top of the art style, washing out everything unique and human about me. I had become the most generic version of myself. All this before we even get to where this art is coming from and the labour that underpins its production. AI art generators are trained on images collected from the web. These images are, broadly speaking, publicly accessible, although they are not without copyright. As Clare Southerton says, it’s “a beauty filter made of stolen artists' work”.


As art is (re)produced through AI, it loses the relationality and cultural specificity that makes art, art. It has no human point of view, and it shows: everything, regardless of the art style requested, begins to look and feel the same. It is, after all, machine-generated, and even when a machine is prompted to create difference, it still produces a standardised product.


At the same time, in a related albeit slightly different arena, a friend of mine sent me a photo of a professional sports team put through a ‘hot girl’ filter. Over text he joked about how hot some of the women were, rendered in emphasised femininity. They now looked like they wouldn’t be out of place on an American version of The Bachelor: long, wavy hair, with heavy, contoured makeup. To me, this example also highlights that the problem with AI art generators is the same as it is with all AI. It reproduces cultural biases and values, and in doing so locks us into repeating past patterns, behaviours and values. It feels to me that, as it reflects and reproduces dominant norms back to us, it narrows all the ways it is possible to be a person in the world.


Finally, I want to think about why we are so concerned with how we are perceived by machines. Why do we want to know how AI sees us? Why are we prepared to accept that (joking/not joking) the For You Page on TikTok knows us better than we do? Machines have long been our mirrors. Now, fed by the data we give/they have taken, we gaze at ourselves reflected in them, thinking they are capable of seeing and knowing us better than we know ourselves. In doing so, we cede some of our selfhood to technological assemblages of code, software and devices that (in the case of generative AI) run on stolen data, stolen art and stolen valour; entertained by the sleight of hand, ignoring the real human labour that makes the machines look ‘effortless’.