The perils of facial recognition surveillance systems
In 1997 my mother took me to see the movie The Saint in the theater. I was astounded by the way the main character, Simon Templar, an orphan turned international man of mystery, confounded the global authorities by changing his appearance and strolling through customs with priceless loot literally tucked under his arm.
The movie depicted several scenes of police officers comparing photos of his disguises, baffled at the vast differences between them. The surveillance society still banks on this powerful image of people scrutinizing people in order to separate truth from fiction.
Fast forward twenty-five years and Simon Templar would not even be able to walk up to the terminal. Facial recognition technology has made a few giant leaps since 1997, and we are now facing a global surveillance machine powered by artificial intelligence, machine learning and vast databases used to train the algorithms. There’s no more room for Saints.
Amazon Rekognition: The all-seeing machine
The reality is several times removed from fiction, as people no longer form the most vital link in the chain of the surveillance society. The machine has virtually taken over the game, and we have taken the back seat, playing the role of an unwilling data set that is compressed, expanded, analysed and processed by the machine.
In order to understand the differences between the way people see machines and the way machines see us, I tried to figure out the best way to beat one of the best facial recognition systems in the world – Amazon Rekognition.
Rekognition is a powerful cloud-based software-as-a-service (SaaS) computer vision platform that was launched in 2016. It can be used practically for free by any internet user with a credit card; the card is used only to verify the user’s identity, while the use of the service itself is free.
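Just how low the barrier to entry is becomes clear when you see what a one-to-one face comparison looks like in code. The following is a minimal sketch using AWS’s boto3 SDK; the helper names and the 80% threshold are my own choices, and valid AWS credentials are assumed:

```python
def best_similarity(response):
    """Pick the highest similarity score (0-100) out of a
    CompareFaces response; 0.0 if no match cleared the threshold."""
    matches = response.get("FaceMatches", [])
    return max((m["Similarity"] for m in matches), default=0.0)

def compare_photos(source_path, target_path, threshold=80.0):
    """Ask Rekognition how similar the faces in two local photos are."""
    import boto3  # requires configured AWS credentials
    client = boto3.client("rekognition")
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = client.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=threshold,
        )
    return best_similarity(response)
```

A dozen lines, no special hardware, no vetting of who is running them – which is precisely the point of the experiment that follows.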
Amazon Rekognition is currently used by several law enforcement agencies around the world, as well as in workplaces and public spaces such as stores and other venues. Amazon itself showcases use cases in several industries, from fishing to healthcare, carefully omitting police departments and other security-related companies.
It does, however, briefly tutor such actors on the correct usage of facial recognition technology in its developer guide, warning potential users about possible misuse of the technology and the need for human supervision.
Biometric surveillance: A political question
Facial recognition technologies have been in the public spotlight for the past few years, with researchers, journalists and activists highlighting their many problematic aspects.
The Netflix documentary Coded Bias, for example, focused on flawed technological solutions that were not able to do their job properly – mismatching faces and misidentifying people – which led Amazon to put a moratorium on police use of its facial recognition technology, following pressure from the public and US political representatives.
The EU-based civil society initiative Reclaim Your Face also warns about the perils of entering a total surveillance society, one in which facial recognition algorithms are perfected to the point that they can effectively target every single person on this planet and then follow that person wherever they go.
“One of the issues with these biometric surveillance systems is the idea that we can tackle any social problem by using the right technological solution,” ponders Pika Šarf, junior research fellow at the Institute of Criminology at the Faculty of Law in Ljubljana. She asks: “Can facial recognition really achieve a certain goal in terms of improved security and policing, and if it can, is this solution really the least invasive measure that can achieve the same aim?” She also highlights the lack of reports on the successes of biometric surveillance systems. “If we examine the large centralized databases which are used to monitor travel flows in and out of the EU, we can note that the EU rarely provides data on how many times these systems were able to stop a ‘problematic individual’ from crossing the border and how many criminal cases they thwarted,” she notes.
Ms. Šarf also mentions the question of terrorism prevention at the EU borders, where surveillance systems use biometric data to prevent certain individuals from entering the EU. “How these systems, working together under the umbrella of interoperability, will decide who is allowed to enter and how they will filter out ‘problematic individuals’ is anybody’s guess at this point,” she says. At the same time, she notes the general belief that “biometric data is supposedly infallible and therefore nothing can go wrong.”
The Amazon moratorium and the civil society initiative focus on the political side of the argument for banning and prohibiting the mass use of biometric surveillance systems. Activists, politicians and active citizens are calling for a legal solution to this issue, one that would encompass the current and possible future systems of biometric surveillance.
Surveillance-fighting power users
On the other side of the spectrum, people are gearing up to fight the system themselves. You can literally find thousands of tutorials online that teach the art of avoiding facial recognition systems, from applying AI-confusing make-up, wearing masks and other pieces of clothing, to simply putting a bag over your head.
These attempts are dangerous for several reasons. First, they broadly misrepresent the issue: the disguises are usually tested against a single facial recognition algorithm at a single point in time. Furthermore, the personal angle and engagement involved suit a particular type of person and would not be feasible as a general solution to the issue. Third, biometric surveillance systems are a process, not a state – they are constantly developing and upgrading and therefore require even more vigilance from users.
“It all comes down to data sets and their quality,” explains dr. Žiga Emeršič, an assistant at the Faculty of Computer and Information Science at the University of Ljubljana.
He focuses mainly on biometrics, deep neural networks and computer vision. “For a working facial recognition system, you have to make sure the data includes obscured faces, photos of faces in low resolution, shots from different angles… all of this matters when you are training a facial recognition algorithm,” he says.
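The kind of data preparation Emeršič describes is routinely automated. The following toy sketch (my own illustration, using the Pillow imaging library) derives a low-resolution and a partially occluded variant from one clean portrait, so that a training set also reflects distant cameras and covered faces:

```python
from PIL import Image, ImageDraw

def augment(face, lowres_factor=4, occlude_frac=0.3):
    """Return [original, low-res, occluded] variants of a face crop."""
    w, h = face.size
    # Low resolution: downscale then upscale, simulating a distant camera.
    lowres = face.resize((w // lowres_factor, h // lowres_factor)).resize((w, h))
    # Occlusion: black out the lower part of the face,
    # roughly as a surgical mask or scarf would.
    occluded = face.copy()
    ImageDraw.Draw(occluded).rectangle(
        [0, int(h * (1 - occlude_frac)), w, h], fill="black"
    )
    return [face, lowres, occluded]
```

Run over a whole database, a handful of such transforms multiplies the training material several times over – one reason why systems trained this way shrug off exactly the disguises an amateur would try.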
His faculty colleague, assistant Blaž Meden, who works on the anonymization of faces in recognition systems, explains further. “Models of neural networks that are used in facial recognition systems are constantly changing. In my work, where I try to ‘break’ faces in a way that keeps them recognizable to humans but not to machines, I find it increasingly difficult to do that successfully,” says Meden, “since the facial recognition technology is getting better and better.”
In order to better understand the nature and abilities of the Amazon Rekognition facial recognition system, I put together several sets of profile photographs that hide, obscure or change my facial features, looking for the most effective way to avoid detection and confirmation of my identity.
I used several wigs, different eyewear, plus a set of surgical masks and various shapes I covered my face with. I compared each of the “masked” photographs with a clear shot of my face and tried to figure out the best way to “cheat” the algorithm.
Although limited in scope, the set comprises 36 photographs altogether, taken on a white background with sufficient lighting, as seen below. Here are my notes on the sub-sets, highlighting the difficulty of avoiding a facial recognition algorithm.
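The bookkeeping behind the experiment amounts to a simple ranking loop. In this sketch, `compare` stands for any pairwise scorer returning a 0–100 similarity – for instance a wrapper around Rekognition’s CompareFaces call; the function and its names are mine, not part of any AWS API:

```python
def score_disguises(reference, masked_photos, compare):
    """Rank disguises from least to most recognizable.

    `compare` is any function returning a 0-100 similarity score
    for a pair of photos (e.g. a CompareFaces wrapper).
    """
    scores = {photo: compare(reference, photo) for photo in masked_photos}
    return sorted(scores.items(), key=lambda kv: kv[1])
```

The disguise at the head of the returned list is the one the algorithm found hardest to match to the clear reference shot.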
Hair does not make a difference
The sets with different wigs did nothing to change the way the Amazon algorithm saw me. Despite the fact that several of the wigs covered the top part of my face, the algorithm saw right through me. Even the removal of facial hair did little to change the way it saw my face.
Glasses help – a little
A combination of wigs and glasses proved more effective, but generally speaking still did not give me an edge in avoiding the algorithm. Despite the fact that several of the wigs completely hid my natural hair, and despite the fact that I shaved off my beard and moustache completely, the match was almost 100% in most cases.
Obscure is secure
The best results came from a combination of wigs, glasses and removed facial hair. As you can see in the examples below, the algorithm failed to recognize me in both photos, although it successfully detected a human face in both examples.
Hiding my face behind a wig, sunglasses and a mask seems to be the best solution at the moment if you want to avoid detection.
This of course means that the “Saint” method of altering your facial features does not work. In fact, the algorithm found Val Kilmer in every masked-face attempt featured in the movie The Saint. What’s even more interesting, the algorithm was able to find Val Kilmer when masked images from The Saint were compared to Kilmer’s roles in other movies (Tombstone).
Interestingly enough, Val Kilmer successfully avoided detection with his Batman mask. Although the mask does not cover the entirety of his face and you can see his typical lips in both images, the algorithm did not see the resemblance. So the Saint is busted, but maybe we can all become caped crusaders solely for the purpose of retaining a shred of our privacy?
“Successfully avoiding detection relies on the way you disrupt the facial recognition pattern engine and the way the engine learns to ignore certain parts of your face,” explains Žiga Emeršič. “For example, covering your forehead does not do much, because the algorithms are usually trained on other parts of your face that are usually not covered by headwear,” he says, adding: “The best way to avoid detection is therefore not to change your face and try to trick the algorithm into thinking you are somebody else, but to obscure your face so that the algorithm does not get enough data to perform its analysis.”
However, covering your face only works against facial recognition systems, warns Žiga’s colleague, researcher and PhD student Matej Vitek, who works in the field of biometric surveillance systems focused on parts of the human eye. “Even in cases where you would cover your face completely, there are other biometric ways to detect a person, using gait analysis and other biometric data sources,” he explains.
But even an obscured face is not a bullet-proof solution, as the Amazon Rekognition algorithm managed to identify me in almost 90% of the masked attempts. So before you start shopping for the items to mimic my best attempt, read on. This is not the solution.
The surveillance data economy is based on ubiquitous, unregulated and private technologies and services that gather, analyse, share and sell the datasets. In the case of biometric surveillance these data sets are actual people – their physical features, their shapes, their physical identities.
The issue of regulation currently still hinges broadly on biometric surveillance technologies not being able to do what they claim to do – extrapolating, we constantly see regulators telling the industry “Do better!”. What they should be saying is “Don’t do this at all!”
Currently, biometric surveillance is regulated through personal data protection legislation, Pika Šarf notes.
“But the whole field is somewhat fragmented – commercial use of biometrics is regulated by the GDPR, the use of the same technology by the police falls under the auspices of the Law Enforcement Directive, while the activities of national security agencies are not regulated at the EU level at all. Even the strictest of the data protection regimes – that of the GDPR – leaves some leeway to member states, for example regarding the use of personal data for research purposes,” she adds.
“The upcoming Artificial Intelligence Act that the EU Commission is putting forward as a way to regulate the field of artificial intelligence also focuses on the issue of biometric surveillance, which is deemed the most dangerous,” Pika Šarf explains, “however, the framework allows an exception for police use of biometrics, which is, to be honest, the most invasive one.”
She would feel more comfortable “with a moratorium on police usage of biometric surveillance technologies during which time we could test and prove its effectiveness.”
In Šarf’s opinion, the current regulatory frameworks are trying to regulate both the imprecise and the extremely precise systems of biometric surveillance at once.
“One of the core data protection principles pays special attention to the quality of the data – what exactly that means in terms of low and high quality of the data used still needs to be interpreted,” she notes, adding: “The quality of the data is only half of the picture; the other half is the design of the AI behind facial recognition technology.”
Also worth noting, in Šarf’s opinion, is that the current proposal of the Artificial Intelligence Act suffers from an abundance of exemptions.
“We can trust the EU Commission’s opinion that the field of biometric data is sufficiently regulated by the GDPR and that there is no need for additional regulatory measures, but there is another possible explanation: that the EU Commission does not want to rule out further development of biometric surveillance and give up already established mass surveillance systems that cost billions of euros in taxpayer money,” she comments.
As our experiment shows, we cannot expect the general population to test the algorithms, anticipate privacy invasions in every single instance of public and private life and respond to them accordingly. Furthermore, we cannot expect this issue to be resolved between the user and the technology owner, since the technology clearly overreaches the capabilities of a single user.
Another important topic to consider while discussing biometric surveillance is the commercial part of the endeavour. Biometric data sets are widely available, even for free. Tools of biometric surveillance are accessible as well, sometimes for free, sometimes for a cost so low it does not really register.
“Some fields of biometric surveillance are developing faster because the amount of data sets which can be used to train the algorithms is bigger,” says Žiga Emeršič, explaining that ear detection lags behind other fields precisely because ear data sets are not as widely available as those of other parts of the face. That is not the case with broad facial biometric algorithms, adds Blaž Meden, “since there’s already a huge number of facial biometric databases available.”
What’s more, “the quality of the facial biometric data is very good, since you have specific data sets that deal with expressions of human emotions, with the way people move through a space, you have data sets that were taken in a photo studio and in the wild…” says Meden.
“It’s worth noting the black-box issue with these biometric surveillance systems, since nobody really knows which data sets are being used by Amazon and other actors in this field, and ample research shows that facial recognition systems are biased against people of colour,” notes Ms. Šarf as she mentions several datasets that are freely accessible for training biometric algorithms.
Searching for free facial biometric datasets online confirms these claims, as one can quickly find several facial biometric databases with tens of thousands of photos of different faces to train recognition algorithms on. At the same time, there are several free online tools that one can use to compare images and match individuals based on biometric datapoints.
Furthermore, the popular tips on how to “deceive” algorithms prove to be highly problematic as well, for several reasons.
First of all, these tips usually come from artistic projects and lack a clear scientific basis. When we ran these supposedly bullet-proof solutions through the Amazon Rekognition software, it successfully detected a human face, although the authors claimed they had found a way to avoid facial recognition.
Žiga Emeršič explains: “These tactics are very specific to a given facial recognition system and are therefore very prone to failing, since the facial recognition models are constantly updating their detection and analysis methods.”
Second – the perception that the user is pitted against the biometric surveillance engine alone and cannot rely on anybody but themselves leads to privacy fatigue and is impossible to sustain in real life. Unless, of course, you want to walk around with a paper bag on your head and consider that completely normal.
Third – you have to understand that facial recognition systems are a process, not a state. They are constantly changing and updating, their authors innovating new ways to overcome avoidance attempts. This is especially relevant because the field of biometric surveillance is highly commercialised and literally accessible to everybody.
At the same time, Pika Šarf notes a problematic public discourse in the field of biometric data and surveillance, where some issues never reach the public forum. “Generally speaking, I would not say the general public is against the use of biometric systems,” she says, “and people do not see a problem in giving up a little bit of privacy for convenience.”
She notes that the use of biometric tools to access smartphones and digital services is normalizing the use of biometric data without any apparent threat to the user’s privacy.
However, we have to differentiate such use from facial recognition employed by law enforcement authorities, which presents a much graver threat to our privacy and data protection.
“At the same time, it is very hard for a scholar or an activist to argue these issues of biometric surveillance, since there is little to no data on the effectiveness of these systems,” she adds.
If we examine the wider field of data protection in the information age, we can see how important smaller groups and individuals are in paving the road towards better protection.
Max Schrems, Patrick Breyer and other activists are on the frontlines of battles with global data intermediaries and other privacy violators, but shifting the entire burden of effective and useful data protection regulation onto them will not pan out in the long run, warns Pika Šarf. “You cannot expect that these people will remain vigilant and focus on every single data protection violation,” she explains.
Biometric surveillance is becoming more and more prevalent and commercialized. At the same time, there is very little, if anything, that can be done from the user’s side, as we have shown. Technology is a political issue and needs to be addressed as such. Unless, of course, we are comfortable with a world where the only thing protecting your biometric privacy is a paper bag over your head. Or, for the time being – a Batman mask.