‘Coded Bias’: Changing Code for the Better
“I am not trying to say AI is good, [or] AI is bad, but that AI is powerful.” - Director Shalini Kantayya.
Coded Bias is a documentary about injustice within technology. MIT student Joy Buolamwini was working on an assignment for her engineering class: a mirror that would place an inspirational person’s face onto your own, much like a Snapchat filter. When she tested it on herself, she found that the facial-recognition software couldn’t detect her face. So she tried again, this time wearing a white mask. Success.
The documentary notes that even the history of AI has been primarily white, which has made progress harder to achieve. Facial-recognition systems from three major companies failed to detect women with darker complexions. Buolamwini reached out to these companies and worked with some of them to change things for the better.
It’s not only powerful companies that use facial recognition; powerful governments do too. In the UK, police have deployed facial-recognition cameras on the general public, using them to find ‘criminals’ when, in reality, they have been traumatizing ordinary citizens. The documentary also turns to China, where facial recognition is woven into daily life: people use their faces to board a train, buy groceries, or even operate a vending machine, and the government’s social credit system rewards citizens for being obedient and docks their “score” if they misbehave.

In Hong Kong, many citizens had had enough of this kind of surveillance. They rebelled: they wore masks to cover their faces and helmets to cover their hair, and even used laser pointers to confuse the cameras. People shouldn’t be controlled or restricted for being disobedient. I believe the director did a great job shedding more light on this important issue.
I am proud to say that this documentary was one for the history books.