AI researcher MaryLena Bleile says it’s not the robot apocalypse we should fear but the biases the machines are learning

TAMMYE NASH | Managing Editor
nash@dallasvoice.com

Next month — July 18-24 — the International Conference on Machine Learning will hold its 38th annual meeting. The conference will include a Queer in AI workshop designed to highlight issues faced by marginalized communities in the AI field through talks and panel discussions on the inclusion of non-Western non-binary identities and Black, Indigenous and Pacific Islander non-cis folks.
Organizers are accepting submissions through Monday, June 14, for presentations during this workshop. Submissions must be generally related to the intersection of LGBTQIA+ representation and AI or be research produced by LGBTQIA+ individuals, according to the Queer in AI website.
This week, Queer in AI member MaryLena Bleile talked with Dallas Voice about the field of machine learning, artificial intelligence and what it means to be a queer person in the AI world.

Dallas Voice: When I think of AI, I always think of Terminator. So, first of all, explain what AI is, just in general layman’s terms, and what it isn’t. MaryLena Bleile: So “Artificial Intelligence” is a broad term that encompasses many different areas of research, all with the general goal of making machines “smart” in some regard. Under that umbrella, the specific computational technique that has seen a major rise in popularity over the past 20 years is called “Deep Learning.”

Think of a machine with a lot of knobs and buttons, each of which controls something very simple, like turning on a light switch. Deep Learning is about tweaking those knobs and buttons so that they all work together to achieve a specific outcome, like turning on light switches in a pattern that forms a picture of a 6.

Machines really can’t do anything they aren’t trained to do, so anyone who’s concerned about a robot apocalypse can sleep easily. The most magical thing about Deep Learning is math!
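To make the knobs-and-buttons picture concrete, here is a minimal sketch in Python, not tied to any system mentioned in the interview: the "knobs" are the numeric weights of a tiny neural network, and training uses calculus to nudge them until the machine's outputs match a handful of labeled examples. The dataset, network size and learning rate are all invented for illustration.

```python
# Toy illustration of the "knobs and buttons" idea: the knobs are the
# network's weights, and training nudges them until the outputs match
# the examples we show it. Everything here is made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Tiny invented dataset: 4-pixel "images" labeled 1 if both left pixels are lit.
X = np.array([[1, 1, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 1],
              [1, 1, 0, 1]], dtype=float)
y = np.array([[1], [0], [0], [1]], dtype=float)

# The "knobs": weights of a one-hidden-layer network.
W1 = rng.normal(size=(4, 8)) * 0.5
W2 = rng.normal(size=(8, 1)) * 0.5

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    # Forward pass: what the machine currently "thinks" about each example.
    h = sigmoid(X @ W1)
    p = sigmoid(h @ W2)

    # Backward pass: calculus tells us which way to tweak each knob.
    grad_out = (p - y) * p * (1 - p)
    grad_W2 = h.T @ grad_out
    grad_h = grad_out @ W2.T * h * (1 - h)
    grad_W1 = X.T @ grad_h

    W1 -= 0.5 * grad_W1
    W2 -= 0.5 * grad_W2

print(np.round(p, 2))  # predictions drift toward the labels [1, 0, 0, 1]
```

Everything the network "knows" comes from those four examples; change the labels and the same knobs get tuned to a different task.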

Is there a large queer presence working in this field? It’s difficult to say, really, since many queer people — particularly senior people — in the field choose not to come out for fear of career repercussions. However, from what I’ve read, it’s unfortunately one of the least diverse fields of academia — along with physics, I believe — along axes including gender, race and sexual orientation. Like I mentioned, this is particularly true for the older generation; in the entirety of the time I’ve been working here, I’ve known exactly two out queer professors, both of whom I met through Queer in AI.

Why is it important to elevate the queer presence in this field? Elevation of all minoritized groups is of utmost importance to AI because of how much Deep Learning amplifies bias. Like I mentioned earlier, a machine can't do anything you don't tell it to do by feeding it data or code. And when you feed it data generated by a process that is sexist, racist, homophobic, etc., you get a machine that has those qualities as well.

For example, some AI systems can learn sentiment, analogies and the structure of language: how “similar” words are to one another. Google has an AI that does this, and a paper (Saucedo, 2018) revealed that according to that AI — as of 2018 — the words most similar to “gay” are “dumb,” “lame,” the R word, “creepy” and “rude.” The most similar word to “lesbian” was “slut.”

Not only does this reflect the homophobic bias in people's language; when these systems are deployed in [common] practices such as targeted marketing and advertising, they also reinforce the existing structure of bias. There are countless examples of this phenomenon when it comes to racism, sexism and transphobia as well.
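For readers curious about what "most similar words" means mechanically, the sketch below shows the standard cosine-similarity calculation used to compare word embeddings. The four-dimensional vectors and the tiny vocabulary are made up purely for illustration; the biased neighbors quoted above were reported for a real, large pretrained model (Saucedo, 2018), not computed here.

```python
# Toy illustration of how "word similarity" is measured in embedding models:
# each word is a vector, and similarity is the cosine of the angle between
# vectors. The vectors below are invented for illustration only.
import numpy as np

embeddings = {            # hypothetical 4-dimensional word vectors
    "gay":      np.array([0.9, 0.1, 0.3, 0.0]),
    "happy":    np.array([0.8, 0.2, 0.4, 0.1]),
    "table":    np.array([0.0, 0.9, 0.1, 0.8]),
    "colorful": np.array([0.7, 0.0, 0.5, 0.2]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means pointing the same way, 0.0 unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

query = "gay"
neighbors = sorted(
    (w for w in embeddings if w != query),
    key=lambda w: cosine(embeddings[query], embeddings[w]),
    reverse=True,
)
for w in neighbors:
    print(w, round(cosine(embeddings[query], embeddings[w]), 3))
```

In a real system the vectors are learned from enormous amounts of human-written text, which is exactly how the prejudices in that text end up encoded as "similarity."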

[Photo: MaryLena Bleile defends her Ph.D. thesis]

Unsurprisingly, research has shown that queer people working in STEM fields are disadvantaged compared to their heterosexual counterparts (Cech, 2017), and the Queer in AI demographic survey has shown that queer people do not feel completely welcome in the community. This is particularly true for queer researchers who are also minoritized along other axes, such as socio-economic status, race and ability type. In the words of one of our organizers, Arjun Subramonian, “Vectors of oppression intersect,” with compounding effects.

There's been a substantial amount of research on what's called "minority stress": Negotiating prejudice and discrimination — and managing outness and concealment as necessary, while potentially also navigating internalized homophobia — creates a cognitive load for queer individuals that depletes mental health and can actually increase rates of physical illness (Frost et al., 2015; Meyer, 2003a, 2003b; Meyer & Dean, 1998). This cognitive load makes thoughtful wonder — which is necessary for good research — a lot more difficult.

Personally, I grew up in a community that believed homosexuality was a perversion of nature, and that "the gays [were] ruining [the country]." People won't generally tell you that at work, of course, because the standards of professionalism in the field prohibit it. But, because it's never spoken of, it can be very difficult to figure out which, if any, of one's colleagues privately hold similar opinions. And needless to say, working with or for someone who thinks you are ruining the country by loving women is not ideal.

From what I’ve heard, my experience is not uncommon among LGBTQ+ folks in the field; Queer in AI provides advocacy and community for those of us in such a situation.

Tell me about the upcoming conference overall, and about the Queer in AI workshop specifically. What is your goal for this workshop and for the conference? Research has shown time and time again that queer individuals — particularly trans, non-binary, non-Western and BIPOC queer individuals — are under-represented in the field and feel unsupported there. In our demographic survey, many queer individuals cited a lack of role models and community as issues that propagate the underrepresentation.

The ICML workshop is a step towards the ultimate goal of equality, where people are not minoritized in any regard. Specifically, the Queer in AI workshop highlights and works to resolve the issues queer individuals face by featuring Black, Indigenous, and Pacific Islander non-cis folks as well as talks and panel discussions on the inclusion of non-Western non-binary identities.

We also have an optional buddy system for community-building, free undergraduate mentoring and a poster session to highlight the work of queer individuals, with free conference registration for presenters. We don’t desk-reject any submissions, and submissions can be in any format.

Why is this field/topic important to the world in general, and to the LGBTQ community specifically? AI is powerful. From self-driving cars, to personalized/adaptive treatment planning and devices like Amazon Alexa — which one of our organizers actually helps develop! — it has achieved an overwhelming presence in our lives and in the anticipated future of technology. But this power can be used to reinforce and amplify the bias present in datasets constructed by biased individuals, which underscores the need for cultural reform and equality.

Tell me about yourself and your work. How did you get into this field, and what do you do? I'm a Ph.D. candidate at Southern Methodist University and UT Southwestern. My dissertation research takes place out of the Medical Artificial Intelligence and Automation lab in UT Southwestern's Medical Physics department, where I'm part of a team working on an AI system to control and optimize radiation treatment for tumor elimination. The agent observes the tumor as it grows and makes adaptive decisions about the radiation treatment protocol.
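As a rough illustration of the observe-and-adapt loop she describes, here is a toy simulation. The tumor dynamics, dose limits and decision rule are invented for this sketch and bear no relation to her lab's actual models or to clinical practice; the point is only the structure: measure the state, adjust the protocol, repeat.

```python
# Deliberately simplified sketch of an "observe, then adapt the protocol"
# loop. All numbers and rules are invented for illustration; this is not
# the actual research system described in the interview.
import random

random.seed(1)

tumor_volume = 100.0          # arbitrary units
dose = 2.0                    # starting daily dose (hypothetical)

for day in range(1, 31):
    # "Observe": a noisy measurement of the current tumor volume.
    observed = tumor_volume * random.uniform(0.95, 1.05)

    # "Adapt": a toy decision rule -- escalate dose while shrinkage is slow,
    # de-escalate once the tumor is responding well.
    if observed > 80:
        dose = min(dose + 0.25, 3.0)
    elif observed < 30:
        dose = max(dose - 0.25, 1.0)

    # Toy tumor dynamics: slight regrowth minus a dose-dependent kill term.
    tumor_volume = max(tumor_volume * 1.02 - 4.0 * dose, 0.0)

    print(f"day {day:2d}  dose {dose:.2f}  volume {tumor_volume:6.1f}")
```

A real treatment-planning agent would learn its policy from data and physical models rather than a hand-written rule, but the feedback loop has the same shape.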

I love it because of how practically useful it is and because it combines a bunch of fields that fascinate me. Getting to work at the intersection of statistics, math, computer science and physics — and in an environment that pushes me to improve in a way that’s kind and supportive, no less — is living the dream, in my opinion.

How I got into the work is kind of a wild ride! I come from a devout Christian family where music is sort of the family trade, so I've been playing cello since I was 3, and I planned to join a convent after I finished my music degree. After getting rejected by the convent, I started to realize how much I enjoy math and science, got a stats degree and started coming out. I knew I wanted a Ph.D. almost as soon as I stopped planning on becoming a nun, before I'd even started in science, because I felt like an undergraduate degree just didn't give me the depth of knowledge I wanted. And that's just my personality; whatever I'm in, whatever I have expertise in, I want to do it or know it 110 percent. I still love music and art, particularly heavy metal.

What have I not asked about that you think is important for people to know? Introducing mathematics and machines into decision processes doesn’t remove the issue of bias; in many cases such an introduction amplifies it. There’s actually a documentary called Coded Bias that discusses algorithmic bias in more depth. It’s a useful and informative film, but it should be viewed with the knowledge that it’s shot from a very Western/America-centric viewpoint which vilifies America’s political opponents in a problematic way.

For more information about the Queer in AI Workshop, visit the website at https://Sites.Google.com/view/Queer-in-AI

References:
Cech, E.A. and Pham, M.V. (2017). Queer in STEM organizations: Workplace disadvantages for LGBT employees in STEM-related federal agencies. Social Sciences, 6(1), p. 12.
Frost, D.M., Lehavot, K. and Meyer, I.H. (2015). Minority stress and physical health among sexual minority individuals. Journal of Behavioral Medicine, 38, 1–8. https://doi.org/10.1007/s10865-013-9523-8
Meyer, I.H. (2003a). Prejudice, social stress, and mental health in lesbian, gay, and bisexual populations: Conceptual issues and research evidence. Psychological Bulletin, 129, 674–697.
Meyer, I.H. (2003b). Prejudice as stress: Conceptual and measurement problems. American Journal of Public Health, 93, 262–265.
Meyer, I.H. and Dean, L. (1998). Internalized homophobia, intimacy, and sexual behavior among gay and bisexual men. In G.M. Herek (Ed.), Stigma and sexual orientation: Understanding prejudice against lesbians, gay men, and bisexuals (pp. 160–186). Thousand Oaks: Sage Publications.
Saucedo, R. (2018). GayTwitter: An investigation of biases toward queer users in AI and natural language processing. The SLU McNair Research, p. 60.