Expert concerns about the Spanish police’s new facial recognition system: mass surveillance, identity loss and data erasure


This year the National Police and the Civil Guard will begin using an automatic facial recognition system in their investigations, as EL PAÍS has reported. The tool, called ABIS, uses artificial intelligence algorithms to determine in a few seconds whether the face in a photograph matches any of those on record (in this case, of people with a police record). The technology is already ready; all that remains is to complete the integration of the database it will work against, according to sources at the Ministry of the Interior. Once that process is finished, which is expected in the first half of 2023, the program, developed by the French defense company Thales, can enter service.

But its deployment raises several questions, especially regarding the transparency and control mechanisms required of tools that work with biometric data. The Spanish Data Protection Agency (AEPD) itself has been examining since September whether it fits within the legal framework, and must determine whether or not it violates any rights.

EL PAÍS asked the Interior Ministry about the unclear points of ABIS identified with the help of several experts, among them doubts about its transparency, how its databases are shared and managed, and how proportionate use of the system will be guaranteed. The Ministry declined to provide additional information to dispel these doubts. These are the issues that concern the engineers, analysts and activists consulted.

1. The shadow of mass surveillance

Police use of automatic facial recognition technologies is sensitive because it allows individuals to be identified unequivocally. The system can detect human faces in digital images, whether from a mobile phone or from security cameras, and extract a unique, unmistakable pattern from each person’s features, just as is done with fingerprints or DNA. There is an important difference, though: while the latter two require physical contact with the person concerned (to take their fingerprints or a saliva sample), with facial recognition everything can be done remotely.
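The article does not describe ABIS’s internals, but systems of this kind typically convert each face into a numeric template (an “embedding”) and compare templates with a similarity score against a tuned threshold. The sketch below is a toy illustration of that general idea; the vectors, dimensions and threshold are invented, not taken from ABIS or Thales.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, candidate, threshold=0.8):
    """Declare a match when similarity exceeds the threshold."""
    return cosine_similarity(probe, candidate) >= threshold

# Toy 4-dimensional templates; real systems use hundreds of dimensions.
suspect_template = [0.12, 0.80, -0.33, 0.45]
same_person      = [0.10, 0.82, -0.30, 0.47]   # slightly different photo
different_person = [-0.70, 0.05, 0.62, -0.21]

print(is_match(suspect_template, same_person))       # True: vectors are close
print(is_match(suspect_template, different_person))  # False: vectors diverge
```

The threshold is the key operational choice: set it low and innocent people surface as candidates; set it high and real matches are missed.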

This makes it an ideal technology for mass surveillance. Beijing has known this for years: the streets of large Chinese cities are full of cameras equipped with these systems, and the authorities can locate any citizen within minutes by searching for their face in real time. There is no escape.

The European Union prohibits the use of real-time facial recognition in public places, and the Interior Ministry flatly rules out using ABIS for surveillance tasks rather than investigations. However, the ministry does not say how, or by whom, the tool’s use will be controlled. “We need to know how often the system will be used: only for especially serious cases, or for any investigation? If its use becomes widespread, it could almost inadvertently become a mass surveillance tool,” says Carmela Troncoso, professor at the École Polytechnique Fédérale de Lausanne (Switzerland) and author of the secure tracing protocol used in Covid contact-tracing apps.

2. Anonymity and algorithmic control

Experts are also suspicious of the anonymity Thales promises in its management of ABIS biometric data. The French company explains that “the information stored consists of alphanumeric data that makes it impossible to identify the person it belongs to.” The point is that if someone were to steal this database, they would not be able to link the faces stored there to their identities.

“That is somewhat questionable. Just because the representations are alphanumeric does not mean they cannot be reconstructed; there is evidence to the contrary: images can be reconstructed from the models. The question is what studies have been done to back up these statements,” Troncoso adds.

As for the algorithm developed by Thales, both the company and the Interior Ministry state that it “has passed the NIST vendor test,” an evaluation run by an independent body with no commercial interest. “That is like saying nothing,” Troncoso replies. “They have a scale, but they don’t say whether the result is good, bad or middling. What score does the algorithm get on that test? Is it adequate for the proposed use case? Who is going to check that it stays that way over time?” The Interior Ministry did not answer these questions from this newspaper.

The automatic facial recognition system identifies a group of people in China.

3. Entering and leaving the database

The other big concern surrounding the use of this technology is who it will be applied to. According to what the Interior Ministry told EL PAÍS, the database the images will be compared against contains some five million facial photographs of detainees and suspects already on file with the National Police, the Civil Guard and other regional forces.

How long does a suspect stay registered in the database? What happens when someone stops being a suspect, for example because a court declares them innocent? Are they removed from the database, or do they remain in it? The ministry preferred not to answer these questions either. “Cleaning these databases is going to be problematic. They have to be kept up to date by exchanging information with the databases of the other institutions involved in the process,” explains Lorena Jaume-Palasí, an expert in ethics and legal philosophy applied to technology and an advisor to the Spanish government and the European Parliament on artificial intelligence. In other words: the Interior Ministry would have to coordinate and exchange information with the Ministry of Justice, something unusual both in this country and in neighboring ones. “There are at least two problems here: on the one hand, the infrastructure to deploy the system nationally and internationally does not exist; on the other, there is a problem of forced coordination between institutions that do not cooperate with each other,” the researcher adds.

Remaining in this database means being liable to be identified as a suspect in any crime. The algorithms can fail, and the officers who screen the candidate matches ABIS returns for each search can also make mistakes.
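A back-of-the-envelope calculation shows why that screening step matters. The database size comes from the article, but the error rate below is a hypothetical figure chosen for illustration, not ABIS’s actual performance: even a seemingly tiny false-match rate, multiplied across millions of stored faces, can surface hundreds of wrong candidates for a single search.

```python
# Illustrative base-rate arithmetic. database_size is from the article;
# false_match_rate is an assumed figure, NOT a published ABIS metric.
database_size = 5_000_000   # facial photographs on file
false_match_rate = 0.0001   # hypothetical: 1 wrong match per 10,000 comparisons

expected_false_candidates = database_size * false_match_rate
print(expected_false_candidates)  # 500.0 spurious candidates per search
```

Every one of those spurious candidates is a person an officer must manually rule out, and each manual judgment is another opportunity for error.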

4. With whom the data is shared

The Interior Ministry intends to share the facial biometric data it stores with its European partners. Thales sources explain that “ABIS in Spain can connect to community databases, such as Eurodac, EU-Lisa or VIS.” As the ministry confirmed to this newspaper, the databases operated by the police will be kept completely separate from civil ones (for example, those containing ID-card photographs). “But if ABIS is integrated into the EU-Lisa system, it will be connected with asylum seekers, who have not committed any crime and are not criminals,” says Javier Sánchez-Monedero, Beatriz Galindo researcher in artificial intelligence at the Department of Computing and Numerical Analysis of the University of Córdoba.

The recent overhaul of Eurodac, the European fingerprint database used to identify asylum seekers and irregular border crossers, removes the requirement of a court order for the police to run a query. If the facial records collected by the security forces are kept in the same database, automated searches will be possible without going through a judge. “It is important to understand what the limits of data management are, and why. Our past experience is that once these systems are launched, they never stop growing,” he adds.

5. Why is biometric data collected?

The question many experts ask is why this tool is needed at all. Has anyone studied whether the potential benefits of deploying the system outweigh the potential problems it generates? The Interior Ministry maintains that ABIS will greatly facilitate police work: it will make it possible to quickly identify, from crime-scene images, suspects who might not otherwise be located.

But implementing an automatic facial recognition system is more than that. “You are not making the process more efficient; you are changing the process itself,” Jaume-Palasí sums up. “It will take good servers, backups, a lot of energy, and retraining of the police professionals who work with these systems, among other things. Moreover, these are systems that cannot work well, because the basic idea, the methodology itself, is flawed. Identifying people from any biometric category always ends in failure; other methods must be used and other aspects evaluated.”

“No set of algorithms is able to capture the multidimensionality needed to include all the parameters that identify a person,” continues the expert in digital ethics. “The process of identifying a person from their biometric data is deeply problematic; it rests on eugenicist ideas. Skin color, for example, is a continuum: you cannot divide it into an exact number of categories. At a technical level that is a problem, because these systems need to define discrete variables, and the same applies to many facial features, such as the opening of the eyes and mouth, or the size and shape of the nose.”

The more sensitive information accumulates, the more likely it is to be misused. “We believe it is simply not necessary to collect biometric data, because it violates fundamental rights and is an invasion of privacy,” says Youssef M. Ouled, coordinator of AlgoRace, a group that investigates the racist consequences of the use of artificial intelligence. “There is a huge lack of data on these systems, which have not been audited. They are betting on securitization, moving in the opposite direction to what civil society is demanding. And add to that the state’s obsession with gathering masses of data on us without knowing very well what it is going to do with it,” he adds.

“This technology has enormous potential to be something dangerous, and we don’t know what its true potential to be useful is,” snaps Troncoso. “We have no guarantee that it will not have serious consequences. Its use is justified in the name of efficiency, but what, or how much, will we gain? Do we have any kind of evidence of that? We are talking about deploying a technology that applies to all Spaniards, in which you cannot choose not to participate, and with which we do not know whether we win or lose.”
