Patricia Routh

Artistic Collaboration with an AI

I'm interested in picking apart and learning about AI text-generated art, with a focus on the awareness that a great deal of coded bias may be written into the AI through its programming and algorithmic training. I was experimenting with a web-based AI art generator app called 'NightCafe' (among others).

In these early experiments I was merely attempting to familiarise myself with the application and interface, and was not expecting to expose 'coded bias' in my results so quickly.


This is my digital portrait/illustration of Carolyn Bryant, without any AI enhancement, which I made in 2016 using the software Studio Artist and Procreate. I decided to use this as the 'base image' along with the text 'Killer, Guilty, and Racist'.


For those who don't know, Carolyn Bryant is the woman who, in 1955 in the USA's Deep South, accused 14-year-old Emmett Till of touching and speaking to her. Her lies led to his brutal beating, lynching and death.

His murder is often cited as one of the galvanising moments for the civil rights movement in the USA.

Also, it should be noted that Carolyn Bryant eventually admitted she lied about Emmett Till due to coercion by her then-husband, who was one of the men who tortured and killed Emmett. No one involved in his murder served any time or was punished in any way.

Why use the text 'Killer, Guilty, and Racist'?

My intent, or hope, in feeding those text prompts into the AI programme along with my image of Carolyn Bryant was to allow or encourage the AI to 'collaborate' with my artwork in a way that might make her image look even more guilty and more sinister.


Here is the image the AI generated from my base image of Carolyn Bryant using the text 'Killer, Guilty, and Racist'.


Instead of interpreting my text prompts in a way that would evoke 'artistic' expressions of guilty, racist white supremacists, it rendered a crudely stereotyped racist image: a menacing person with an afro and dark skin under a white mask or makeup, aiming some sort of weapon directly at the viewer. Ironically, it is an image that would provide a great deal of confirmation bias about race for people like Carolyn Bryant, her violently racist husband and his equally repugnant friends.


This is disappointing, even a bit shocking, but it is also a really powerful example of the limitations of AI and algorithms.

The coded bias they contain can only be as thoughtful as the people who programmed and designed them. So to this particular AI, the term 'racist', instead of signifying people who commit racially motivated violent crimes, prompted it to generate a stereotyped racist image instead.


Some algorithms are obviously more nuanced than others. I did the same search, 'killer, guilty, racist', using Google Images, and the results delivered nothing but white supremacist criminals who had been convicted of violently killing people of colour.


Google Images results:

Travis McMichael, Greg McMichael, and William "Roddie" Bryan, killers of Ahmaud Arbery.



Gary Dobson (left) and David Norris, killers of Stephen Lawrence.

So it's not my search terms... it's whoever, or whatever data, designed this art-generating AI that is perpetuating negative racial stereotypes, whether unconsciously or intentionally.


