The homophobia in artificial intelligence

From Philadelphia Gay News: I ran across an article in Forbes about how A.I. (artificial intelligence) will never be LGBT friendly. Why does it matter?

We are all now being profiled by A.I. If you're on social media, your profile most likely includes photos of your face that can be used for facial recognition. Facial recognition is already being used by corporations, apartment buildings and even some airports in the U.S. for security. What if this new technology doesn't recognize you, or what if it recognizes you as LGBT?

Joy Buolamwini, an M.I.T. graduate student working on an A.I. facial recognition project, built a kind of smart mirror: when you looked into it in the morning, it would overlay another face that might give you a smile or some inspiration, similar to a filter. But the mirror didn't recognize her. She discovered, unfortunately, that it worked only if she wore a white mask. Buolamwini is Black, and what she had found was discrimination built into the A.I. itself: the software had been trained mostly on lighter-skinned faces, so it failed on hers. That led her to realize that as facial recognition becomes more widely used, it will discriminate against dark-skinned people. The basic point is that a program can only be as good as the people who write it and the data they train it on, and sometimes those people build their own prejudices into it without realizing it.
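To see how that can happen, here is a minimal, synthetic sketch. This is not Buolamwini's project or any real system; the "groups," features and numbers are all invented for illustration. A simple classifier is trained on data where one group of faces vastly outnumbers another, and its detection rate collapses for the under-represented group:

```python
# Synthetic illustration only: "group A" and "group B" stand in for over-
# and under-represented skin tones in a face-detection training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def faces(center, n):
    # Toy stand-in for image features: n "faces" drawn around a group center.
    return rng.normal(loc=center, scale=1.0, size=(n, 2))

# Training set: 950 faces from group A, only 50 from group B, 1000 non-faces.
X_train = np.vstack([
    faces([3.0, 0.0], 950),                # group A faces (over-represented)
    faces([-3.0, 0.0], 50),                # group B faces (under-represented)
    rng.normal(0.0, 1.0, size=(1000, 2)),  # non-face clutter
])
y_train = np.array([1] * 1000 + [0] * 1000)  # 1 = "face detected"

model = LogisticRegression().fit(X_train, y_train)

# Detection rate, measured separately for each group of faces.
for name, center in [("group A", [3.0, 0.0]), ("group B", [-3.0, 0.0])]:
    detected = model.predict(faces(center, 500))
    print(f"{name} detection rate: {(detected == 1).mean():.0%}")
```

Nothing in the code says "discriminate." The skew in the training data does all the damage, which is exactly why it is so easy for the people building these systems to miss.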

How serious an issue is this? Amazon found it was receiving more resumes than humans could keep up with, so it built an A.I. program to read them and decide which applicants should receive an in-person interview. Someone discovered along the way that the algorithm was recommending only a small percentage of women, and that if your resume mentioned a women's college or women's sports, it was downgraded. That's gender discrimination… done by a computer program that learned the bias from the company's own past hiring decisions, built by programmers who might not even have realized what their system was absorbing.
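A small, synthetic sketch shows how little it takes. This is not Amazon's system; the resumes, hiring outcomes and numbers below are invented for illustration. A text classifier trained on skewed past decisions learns to penalize the word "womens" entirely on its own:

```python
# Synthetic illustration of how a resume screener can absorb gender bias
# from biased historical decisions. All data here is made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Pretend hiring history: identical resumes, except the ones mentioning
# "womens" were mostly rejected by past (human) reviewers.
resumes = (
    ["python developer chess club captain"] * 40 +
    ["python developer womens chess club captain"] * 40
)
hired = [1] * 36 + [0] * 4 + [1] * 8 + [0] * 32  # skewed past outcomes

vec = CountVectorizer()
model = LogisticRegression().fit(vec.fit_transform(resumes), hired)

# The model has learned a large negative weight for "womens", even though
# nothing about the word predicts job performance.
idx = vec.vocabulary_["womens"]
print(f"learned weight for 'womens': {model.coef_[0][idx]:.2f}")

# Scoring new applicants: the same resume scores lower with "womens" in it.
for text in ["python developer chess club captain",
             "python developer womens chess club captain"]:
    p = model.predict_proba(vec.transform([text]))[0, 1]
    print(f"{p:.0%} hire score: {text!r}")
```

No programmer typed a rule against women; the program reverse-engineered one from history and applied it at scale.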

So now that we know this new technology discriminates by race and gender, can it not also discriminate against you as an LGBT person? Yes, and for the same reason: the people writing the code, and the data they feed it, carry prejudices they may not even recognize.

We've seen A.I. discriminate against people by their race and by their gender. How long, do you think, before A.I. discriminates based on sexual orientation and gender identity? How long before A.I. is used to reject LGBT applicants from jobs? Or before A.I. is used to identify LGBT people in countries where being LGBT is criminalized?

There may already be an answer to that question. A Stanford University study found that an artificial intelligence system using a facial recognition algorithm could pick out whether an individual is gay more accurately than people can. The study is controversial in the tech world, mostly among those who insist facial recognition reads only the structure of a face; they overlook the fact that many facial recognition programs add information about those faces gathered from other sources, such as social media. Looking at your social media, your buying habits and the information you've given to surveys and employers, do you really believe all that material won't point to who you are? Welcome to the future.

Mark Segal is an American journalist. He is the founder and publisher of Philadelphia Gay News and has won numerous journalism awards for his column "Mark My Words," including best-column honors from the National Newspaper Association, the Suburban Newspaper Association and the Society of Professional Journalists.