Photo by Joseph Chan on Unsplash
Across the world, people are resisting facial recognition. In the UK, pedestrians are covering their faces. In Hong Kong, protesters block cameras with umbrellas and disrupt them with lasers. In the United States, residents of an apartment building are fighting efforts to track their movements and actions by video.
What we are witnessing is something out of a dystopian movie, where our faces are cross-referenced against police databases every time we step out into public view. In the Netflix documentary 'Coded Bias' we learn how inaccurate these systems are, particularly when analyzing people of color.
In a study by Joy Buolamwini, an MIT researcher, we watch Buolamwini attempt to create a motivational avatar for her computer screen using Venus Williams's image. The system struggles to detect a face at all, but as soon as Buolamwini puts on a white mask, it instantly works.
The field of AI was founded at Dartmouth in 1956 by a group consisting of white men. Essentially, machine learning systems learn from whatever data is fed to them. Once trained, a system becomes a black box whose outputs even its own programmers cannot fully predict, yet many institutions and government agencies rely on this black box to predict creditworthiness, likelihood of reoffending, and to identify wanted individuals.
Organizations use this as a faux system of neutrality. As the technology becomes more widespread, Western society is devolving, beginning to resemble the programs used in China that compute a sort of social credit score, where people have their whole lives boiled down to a number. A number that decides how you rate as a human.
In one scene in 'Coded Bias' we see a government surveillance van capturing images of ordinary morning pedestrians and cross-referencing them against a database of known offenders. When one individual dares to resist the operation by pulling his shirt over the lower portion of his face, he is immediately intercepted by officers demanding to establish his identity. Worse, we see a 14-year-old Black male intercepted by officers when his face is analyzed and flagged as a match for a person of interest.
The young man, in school uniform, is pulled aside and digitally fingerprinted to confirm the officers' findings. Without reasonable suspicion or probable cause, other than what the black box system told them, the young man is detained as his friends stand by and watch. Once the officers positively identify him, they realize the match was false, and he is sent on his way.
In the United States, we have the Fourth Amendment, which protects citizens from unlawful search and seizure. We have a right to privacy, but without federal regulations our constitutional rights are being trampled by the advance of algorithms designed to predict future behavior. Analyses of the historical data collected on these algorithms have shown them to be less accurate than a coin flip. But they offer authorities deniability. Further, they remove liability from courts and criminal justice systems.
No matter how inaccurate they prove to be, these systems remain in use, and the ramifications include mandatory detention for arrestees, stiffer parole supervision, and the erosion of the basic right to be judged by a jury of one's peers. The algorithm is programmed to take into consideration an individual's income, how many friends and associates are unemployed, and what type of household they grew up in. The catch-22 is that you can't blame the system; it's a computer, and a computer can't be racist. It is viewed as an impartial system, but it is anything but.
While facial recognition has produced some positives, like identifying the Annapolis shooter, the minimal success stories do not justify mass scanning of the presumed-innocent public. Authorities cannot disregard the constitutional rights of millions under the guise of security, but that is just what they are doing.
"Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety." — Benjamin Franklin
In Communist China, the deep-state surveillance system and its social credit score are not tools used in the dark, but well-publicized ones. The oppressive government is essentially telling its citizens: you are being watched and judged, act accordingly. As Buolamwini opined, it is basically an algorithmic obedience tool. In the West, the violation is subtler. It's in your smartphone, capturing every stream of thought you take to Google.
It's in your Amazon Prime account, tracking your every purchase and later recommending future ones. It is deployed under the guise of making your life easier. Simply submit your face to the app and it will unlock for you without your having to punch in a code. Systems like this work well in a world where everything is increasingly at our fingertips. But our images have been captured by big business: Apple, Meta, and other social media companies. We have voluntarily submitted our faces to big tech without reading the fine print.
Without checks and balances in place, Western society will soon resemble an Orwellian one. Facial recognition is big business. Beyond the obvious law enforcement uses, there are potential commercial ones. Vendors could use facial recognition supplied by social media companies to learn about the customers physically entering their stores.
That data could be used to determine a person's shopping history, travel, and creditworthiness, all of it voluntarily submitted and captured via social media. Under this form of corporate surveillance, people could be tagged and categorized. What you share with family and friends can be monetized to target you.
In a positive turn of events, some organizations have recognized the bias within these programs and taken action. When Buolamwini spoke at IBM and demonstrated the issues with its platforms, the company made changes toward a more accurate, inclusive system. Those changes were voluntary, made out of goodwill, but what about less scrupulous organizations? Without regulations in place, these technologies will grow. Some will operate openly while others will lurk in the dark.