What a picture of Alexandria Ocasio-Cortez in a bikini tells us about the disturbing future of AI

New research on image-generating algorithms has raised alarming evidence of bias. It’s time to tackle the problem of discrimination being baked into tech before it is too late

Want to see a half-naked woman? Well, you’re in luck! The internet is full of pictures of scantily clad women. There are so many of these pictures online, in fact, that artificial intelligence (AI) now seems to assume that women just don’t like wearing clothes.

That is my stripped-down summary of the results of a new study of image-generation algorithms, anyway. Researchers fed these algorithms (which function like autocomplete, but for images) pictures of a man cropped below his neck: 43% of the time, the image was autocompleted with the man wearing a suit. When they fed the same algorithm a similarly cropped photo of a woman, it autocompleted her wearing a low-cut top or bikini a massive 53% of the time. For some reason, the researchers also gave the algorithm a picture of the Democratic congresswoman Alexandria Ocasio-Cortez and found that it automatically generated an image of her in a bikini. (After ethical concerns were raised on Twitter, the researchers removed the computer-generated image of AOC in a swimsuit from the research paper.)

Why was the algorithm so fond of bikini pics? Well, because garbage in means garbage out: the AI “learned” what a typical woman looked like by consuming an online dataset that contained lots of pictures of half-naked women. The study is yet another reminder that AI often comes with baked-in biases. And this is not an academic issue: as algorithms control increasingly large parts of our lives, it is a problem with devastating real-world consequences. Back in 2015, for example, Amazon discovered that the secret AI recruiting tool it was using treated any mention of the word “women’s” as a red flag. Racist facial recognition algorithms have also led to black people being arrested for crimes they didn’t commit. And, last year, an algorithm used to determine students’ A-level and GCSE grades in England seemed to disproportionately downgrade disadvantaged students.

As for those image-generation algorithms that reckon women belong in bikinis? They are used in everything from digital job interview platforms to photograph editing. And they are also used to create huge amounts of deepfake porn. A computer-generated AOC in a bikini is just the tip of the iceberg: unless we start talking about algorithmic bias, the internet is going to become an unbearable place to be a woman.

  • Arwa Mahdawi is a Guardian columnist

