A CCTV surveillance camera monitors a road as people cross in Japan.

Oodles of people, joined by a coalition of religious organizations, gathered at a nonviolent protest in front of the wedding cake-like architecture of Pasadena City Hall. We were there protesting the murder of George Floyd and many others. It was the end of May, the spring weather was beautiful, and I was standing front and almost center. I remember wondering, “Is my face being scanned by someone?” Probably. I had friends who didn’t want to attend for that very reason.

I would not let anything prevent me from being in the socially distant and masked crowd of thousands: people of all colors, ages and sizes. I exulted in the excitement; the fierce calls for justice; the wafting odors of shampoo and aftershave; some people’s dogs decked out in protest harnesses. We were chanting, singing, being together. There was nothing obvious to fear from the crowd. Surveillance, though? That’s less obvious, and it’s a frightening factor that goes well beyond protests. Which brings me to the striking new documentary film “Coded Bias.”

We go about our daily lives on a spectrum of trust regarding our computers and smartphones and the tech that powers them. At one end of the spectrum is obliviousness; at the other, paranoia. As with many things, most of us land somewhere in the middle.

Enter “Coded Bias,” which has already made a big splash and put the Silicon Valley Google-Facebook hegemony and its white male bias under a veritable social microscope. The film could be called “Coded by Us,” the “us” in this instance being predominantly cis white men of privilege. Up to now, it was a given that artificial intelligence reflects so-called “natural” intelligence. This film brilliantly reveals the hidden side of AI and its algorithms, which mirror their creators’ bias and bigotry, and shows how that flawed tech is being used to negatively impact lives worldwide.

Directed by Shalini Kantayya, “Coded Bias” was a hit at Sundance. It features the ebullient and whip-smart Joy Buolamwini, who as a child dreamed of one day joining MIT’s legendary Media Lab. Joy had seen a whimsical robot the lab had created, realized that was the life for her, and then made it happen. Ms. Buolamwini is the Queen of Tech Nerds, so kinetic on screen that I wanted to stay with her for hours.

One aspect of the film traces Joy’s journey from Media Lab computer scientist to digital activist via her nonprofit, the Algorithmic Justice League. It started innocently enough when Joy decided to create a fun art project: a mirror that would recognize her face. What she found was that her dark skin rendered her, wait for it… invisible. Then she tried looking into the camera while wearing a white mask and voilà! The more she dug, the more she found that AI and its attendant facial recognition algorithms (another word for “rules”) reliably recognized only white male faces. The metaphor of “being invisible” as a woman of color couldn’t be more apt. You can’t make this up, right?

You may wonder, “Who cares about algorithms?” These days, companies use them to hire and fire; universities to sort through admissions; law enforcement to assess threats. In “Coded Bias,” you see firsthand how hiring algorithms like these automatically reject anyone who is not white and male. Seriously.

In a particularly upsetting scene, a young Black man on London’s streets is detained because a facial recognition surveillance “bot” thinks he looks like a suspect. He is only 14 years old and is visibly shaken by the plain-clothes cops who yank him off the sidewalk. They grill and fingerprint him, regarding him as a criminal. In other words, the AI-based system treated him just like a bigot would in real life.

Silicon Valley tech nerds are notoriously sexist and racist and don’t have much interest in transforming their culture, since the status quo works for them. In fact, one of the film’s “talking heads” is a recently fired woman of color who worked at Google. A highly regarded researcher, Timnit Gebru was sacked after criticizing Google’s approach to minority hiring and the biases built into today’s artificial intelligence systems.

Ironically, Ms. Gebru was a co-leader of Google’s Ethical AI team! That would be like firing a house inspector for reporting a termite infestation. On a positive note, some major companies have hired Ms. Buolamwini and her team to help them delete their mechanized chauvinism.

The film’s theme resonated with me, as I have been trained to see bias after years of studying media and its hidden coding—the subtle messages that become obvious once you break the code. Here’s a classic example: The most recent Costco Connection magazine cover features an altruistic, tall white man standing next to an Indigenous woman. He is named; she remains nameless.

Do we really want to allow the unfettered and nonconsensual capturing of our faces for commercial and law enforcement profiles? This column is too short to dive that deep. That is why you really must see “Coded Bias.” 

 

Ellen Snortland has written “Consider This…” for a heckuva long time, and she also coaches first-time book authors! Who knew? Contact her at ellen@beautybitesbeast.com.