Study Finds that Self-Driving Cars Are ‘Racist’
Despite a recent study that shows self-driving cars are blind to certain skin tones, Automologist Ling is optimistic that they will learn to see all people.
A study out of the Georgia Institute of Technology has sparked headlines that suggest self-driving cars are or will be racist.
Here is what the study did. It took a set of images of pedestrians and categorised them according to the Fitzpatrick scale. The Fitzpatrick scale is used to predict the skin's tendency to burn when exposed to UV light, which is determined by the amount of pigment in the skin—in other (less PC) words, how dark or fair you are. Categories 1 to 3 are fairer than categories 4 to 6.
The researchers then tested advanced object-detection models to determine how accurately they detected the presence of fairer and darker people. The systems were 5% less accurate at detecting darker-skinned pedestrians, even after accounting for time of day and partially obstructed views. And thus, some articles are labelling self-driving cars as racist.
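The study's core measurement is simple to picture: bucket each pedestrian image by Fitzpatrick type, then compare detection rates between the fairer (1–3) and darker (4–6) groups. A minimal sketch of that comparison, using entirely made-up illustrative numbers (not the study's data), might look like this:

```python
# Hypothetical sketch of the study's core measurement: per-group detection
# accuracy for pedestrians bucketed by Fitzpatrick skin type.
# All data below is illustrative, NOT taken from the Georgia Tech study.

def accuracy_by_group(detections):
    """detections: list of (fitzpatrick_type, detected) pairs, where
    fitzpatrick_type is 1-6 and detected is True if the model found
    the pedestrian. Returns detection accuracy per skin-type group."""
    groups = {"fairer (1-3)": [], "darker (4-6)": []}
    for skin_type, detected in detections:
        key = "fairer (1-3)" if skin_type <= 3 else "darker (4-6)"
        groups[key].append(detected)
    # Fraction of pedestrians the model successfully detected, per group.
    return {k: sum(v) / len(v) for k, v in groups.items() if v}

# Illustrative sample: 20 fairer-skinned and 20 darker-skinned pedestrians,
# with a 5-percentage-point gap like the one the study reported.
sample = ([(1, True)] * 18 + [(2, False)] * 2 +
          [(5, True)] * 17 + [(6, False)] * 3)
print(accuracy_by_group(sample))
# {'fairer (1-3)': 0.9, 'darker (4-6)': 0.85}
```

The point of the exercise is that the gap survives even after controlling for confounders like lighting and occlusion, which is what made the finding newsworthy.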
Now, I wouldn’t go that far. I remember when voice recognition technology was just emerging and my phone could not understand my Malaysian accent, no matter how slowly or carefully I enunciated each word. There was algorithm bias, for sure: the early data fed to the AI came from individuals sequestered somewhere in an office in Silicon Valley, where I’m guessing there were very few Malaysians.
But I wouldn’t say that the voice recognition technology was racist. Just today, I was Ok, Googling to my phone to ask the meaning of life and to add events to my calendar; it understood me just fine. My accent hasn’t changed, but the AI has become more ‘inclusive’.
And the Georgia Tech study has some flaws. Mainly, as pointed out by Vox, it didn’t actually test any object-detection models presently used by autonomous cars; the car manufacturers, of course, want to keep their tech secret.
Whilst it didn’t cost me my life that I wasn’t able to simply tell my phone to “Call Mum”, a car that cannot ‘see’ me could just drive into me. The study is thus timely, as autonomous tech is being furiously developed. There will probably always be algorithm bias (and by that, I really mean human bias), but what’s important is to learn and react quickly to include all the people who will use the technology. When autonomous cars hit the public roads en masse—which, by the looks of things, will be quite soon—I’m optimistic that their ‘eyesight’ will have improved to see people of all colours.