Girl, mislabelled

There is a thing AI does that I think is very clever, which is describing images for the visually impaired. I’ll always write an alt text label for an image, the text that screen readers read out to users, but I tend to just copy the caption I’ve written. So I’ll say “Banana” where AI will say “Photo of a blackening banana with a best-before date of yesterday.”
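If you’ve never looked at where that label lives, here is a minimal sketch in TypeScript, assuming a browser page; the filename and both captions are made up for illustration.

    const img = document.createElement("img");
    img.src = "banana.jpg"; // hypothetical image file

    // My human-written caption, copied straight into the alt text:
    img.alt = "Banana";

    // What an AI describer might offer instead:
    // img.alt = "Photo of a blackening banana with a best-before date of yesterday.";

    // Screen readers announce img.alt when they reach the image.
    document.body.appendChild(img);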

Only, I saw an example of this that has depressed me. It went by on Twitter in a disappointing flash, and I can’t seem to find it again, so maybe I’m wrong. Hopefully I’m wrong.

But it was an example of this use of AI to describe images, and I believe the thrust of it was that this had been done without any bias or AI hallucination: that the description of the image was entirely and impressively accurate, entirely and reassuringly free of presumption or bias.

The caption read: “Girl on a train, wearing a red hat and smiling at the camera.”

It was wrong. From word one.

I’m astoundingly bad at ages, but this was an adult woman, not a girl. So by “without bias or presumption”, what the makers of this AI tool meant was without any bias other than their own. And none of us considers our own biases to be biases.

If AI is to become the useful tool that in fact it long ago became, and if it is going to work by systematically stealing information from everyone and everywhere, you would want it not to shriek out at you that its algorithm was written by men.

Brand new technology, same old problems.
