INTERVIEW

"Coded Bias" exposes how facial recognition discriminates based on race, gender, and more

Filmmaker Shalini Kantayya spoke to Salon about the need to hold companies accountable for tech & why there's hope

Published June 17, 2020 5:00PM (EDT)

Coded Bias (AFI DOCS)

As unsurprising as it is insidious, facial recognition algorithms are biased. They discriminate based on race, ethnicity, gender, and even ability. The technologies are, shockingly, unregulated in the U.S. As a result, the algorithms can be — and are — widely and freely used in predatory ways to exploit the poor and vulnerable. They slow social progress, exert unwelcome influence (in hiring practices, college admissions, and advertising, for example), and have profound implications for law enforcement. These algorithms are oppressive; they can cause people — and people of color especially — to lose their jobs, wealth, housing, personal freedom, and security, and to receive unequal access to health care. In Pennsylvania, algorithms designed to predict recidivism rates are used by judges to determine sentencing guidelines.

"Coded Bias" is Shalini Kantayya's urgent, cogent documentary, screening June 20 at AFI DOCS, that chronicles the ways facial recognition technology and associated algorithms are impacting our lives. Kantayya focuses on the heroic and necessary efforts of Joy Buolamwini, an MIT researcher who made an "aspire mirror" as part of a project only to discover that the facial recognition software did not register her skin color. However, when she donned a white face mask, it did. Is it possible that machines have been designed to have unconscious bias and recognize white men because they have been designed by white men?  

Kantayya uses Buolamwini's findings as a jumping-off point to investigate other asymmetrical power situations, such as a Brooklyn apartment building that monitors its tenants and a UK organization called "Big Brother Watch" that fights against the harmful and inaccurate use of facial recognition cameras by the police. 

"Coded Bias" also showcases mathematician Cathy O'Neill, author of "Weapons of Math Destruction," who explains the nefarious and unethical uses of algorithms, big data, and technology to make predictions that further racial and economic inequality. 

The filmmaker chatted with Salon about her new film and the perils of the algorithms.

First, let me ask you what websites you most frequently visit, what you have ordered from Amazon this year, and what is in your Netflix queue? Don't think of this as profiling. I'm just looking to get to know you better.

I would say "Devs," the Alex Garland series. Off the top of my head, I also watched the Jeffrey Epstein documentary. Watch that before you work out, because you'll want to hit something. I'm writing a film that involves a kidnapping, so I'm watching a lot of kidnapping things, like the Patty Hearst documentary. It's strange being profiled by what you're looking at. 

How did you find Joy Buolamwini, and determine there was a film here? Did Joy come first, or did you have the idea and then find Joy?

I think with documentary, there's a process of casting, of throwing spaghetti against the wall and finding a story. With Joy, there was enough there. You shine a light on ordinary people doing extraordinary things. She was charismatic and communicated the issues well. You get an inkling something is going to happen, but you don't know what. When she testified before Congress, I knew I had a strong doc. 

What I respect about your film is that you interview women and people of color almost exclusively. What can you say about that decision, and selecting your talking heads?

I find it funny that every screening has asked that question, but films with all white men talking about AI are never asked it. I welcome the question, but I hope the film challenges that notion altogether. As a woman and a person of color, I am conscious of who I set up as authority figures. Women are leading this; I noticed that while making the film. It's not a "women's film," and you'd never know from it that women are minorities in the AI and tech documentary space. It became a more deliberate choice, but the people leading this cutting edge of science are women, and their identity as outsiders allows them to see things in the industry that most others overlook. 

You unpack, investigate, observe, and explore the various ways facial recognition technology and algorithms are used. Your film certainly raises tremendous concerns, but what did you discover making this film that disturbed you to a degree that it changed your behavior?

I barely use social media because it is the enemy of deep thinking. Most of the conversation around this technology is about privacy, which has implications for property rights. But what I came to fear was invasive surveillance: systems deployed at mass scale, never tested for accuracy or bias, making decisions about health care, jobs, and prison sentences, and the implicit trust we place in algorithms designed and deployed by a small group of people.

Just this past week, IBM got out of the game, Microsoft said it will not sell facial recognition technology to police, and Amazon pressed pause for one year. They were already selling technology that was racist, deployed on a population with no governing policy guiding this behavior. It took three genius tech women to figure out it was racist. That can't be how this works. We need more than these three women to go up against big technology. We need rules in place for how these algorithms that impact us all get deployed. 

There is a critical, and often missing, ethical component to these algorithms. How much of that do you think is because the folks building the technology are unable to teach AI anti-discrimination, and how much is predatory intent? 

I would say that it's not intent, or malice — but that's what's so dangerous. It's the consequences we don't see, what's in our blind spot. They think these systems will be more fair than humans, but these companies need outside oversight and independent boards. We need to regulate technology as we do every other sector of society. Otherwise it creates an imbalance of power that's dangerous to democracy.

"Coded Bias" shows multiple, negative examples of facial recognition technology and algorithms being used in law enforcement, from a 14-year-old black youth being unfairly detained in the UK to a Philadelphia woman being asked to report to her parole officer after years of positive accomplishments. Do you think the people in power will change or even think critically about how they use their technology once aware of deficiencies?

Different companies have different reactions. Companies only act when citizens engage and take to the streets and they have no other option. There should be laws in place to make them respond. 

There is a fascinating segment in the film where you address China's efforts to maintain social order through a "social credit score" that "wills people to behave." One interviewee appreciates the system because she trusts people with high social credit scores more. 

I think of that as a "Black Mirror" episode. What that woman is expressing is something we all feel: "Look at how efficient this is." I can see your Instagram followers and Facebook likes and decide how important you are. Including that scene is a mirror to show how we have grown to love Big Brother in a particular way. 

Isn't it terrifying though?

Having unfettered access to data and having your face be your score? Absolutely! It would have been tricky to put someone on camera saying something against the system [in China] without putting them in grave danger. 

I'm curious: in what ways do you think this technology can be used for good? 

Everyone in the film loves tech. I'm a "Wired" fanatic. I made the film because I love technology. But the business model and the lack of legislation have created huge monopolies. What happened to our antitrust laws? Tech is governing our freedom of expression. It can infringe upon so many rights that our democracy is based on. We have to look at this and not have a Wild Wild West. The business model that allows companies to sell you as the product to everyone else has to change. I want to roll back the curtain on the Wizard of Oz. It's not magic, and we should not trust these systems implicitly. We don't have choices. Where can you go to hide from Google search? There are no other options. Our modern lives are lived online, even more so because of the pandemic. If we care about having a strong democracy, we need rights and protections for ordinary citizens in this space. We can't choose to opt out of this technology, so we need to demand meaningful civic participation online. 

Your film absolutely makes me rethink my social media presence.

Companies are marketing to me based on a conversation I had on my phone. We need protections: this can't be OK. This pervasive surveillance makes COINTELPRO and the East German Stasi look like they had a light touch. Where is this info getting sold? Someone can compile a complete social profile of who you are: whether you will have a baby in six months, or your sexual orientation. We need to rein this in with government and laws. Senator Orrin Hatch's glasses can't be the most advanced technology in Congress.

Some of these issues in your film may not be visible for years. What are your thoughts on the long-range repercussions of this?

Cathy [O'Neil] proposes an FDA for algorithms: a vetting system, like the pharmaceutical industry has, to ensure this technology does not do undue harm. Don't leave the tech bros alone on these issues. Their own internal ethics boards can't handle this one. When Mark Zuckerberg decided he would not vet political ads for truth, he turned his back on democracy. We want to know what happened in his secret meeting with Trump. He needs to be held to a different standard. There needs to be a vetting process. Cathy calls them "weapons of math destruction" because these technologies can be deployed on a massive scale and hurt people on a massive scale. We should all be concerned about that. 

We can also be a little more suspicious, critical, and questioning of the systems we use every day. Why is this coming up in my feed? Some of it is so insidious and opaque that it's hard to know if we've been denied an opportunity because of an algorithm. I hope the public educates itself, because that is the first step to making change.

That said, I'm actually a little hopeful. Having three massive tech companies stop selling to police is a major step and an acknowledgment that there is an issue. We still have a chance to make our voices heard. Many of these systems are not set in stone; the cement is still wet. The current protests are a real moment to say we want a more humane society, and technology is not always the most humane or best solution. We need to look at how humanity gets lost when we prioritize efficiency at the cost of all else.

"Coded Bias" will screen as part of AFI DOCS on Saturday, June 20 for 24 hours beginning at 12:01 a.m. ET. 

By Gary M. Kramer

Gary M. Kramer is a writer and film critic based in Philadelphia. Follow him on Twitter.


