
SIGHT


“Big Brother”: Reboot of Buenos Aires facial recognition plan fuels privacy fears

DAVID FELIBA, of Thomson Reuters Foundation, reports that NGOs fear a controversial facial recognition system – dubbed Buenos Aires’ Big Brother – could be revived, threatening rights…

Buenos Aires, Argentina
Thomson Reuters Foundation

After a relaxing weekend away, Guillermo Ibarrola was walking out of a train station in Argentina’s capital when police arrested him and accused him of a robbery committed hundreds of miles away in a place he had never visited.

“It was a nightmare,” Ibarrola told local media after the 2019 incident, which rights campaigners say highlights the risks of using facial recognition systems to surveil populations.


Police in Buenos Aires monitor surveillance cameras and apply the Fugitive Facial Recognition System (SRFP). PICTURE: Buenos Aires City Government/Handout via Thomson Reuters Foundation.

The system of 300 cameras linked to a national crime database – dubbed Buenos Aires’ Big Brother – was suspended two years ago after a court found it may have been used to collect data on journalists, politicians and human rights activists, and ruled it unconstitutional.

Now city authorities, who denied the allegations of abuses, want to reinstate the system, but rights experts warn it poses a threat to citizens’ rights and is prone to errors because it relies on a low-quality database.


The Fugitive Facial Recognition System (SRFP) was introduced in 2019 to screen live feeds of individuals in key transport hubs, searching for potential matches between the images and identities in the CONARC database, which is run by the federal Ministry of Justice.

This database contains details of about 40,000 people who are wanted by the police for various crimes.

In its 2022 ruling, the court said there were multiple inconsistencies and errors in the system, adding that the city government had failed to comply with legal requirements to protect residents’ personal rights.

The NGOs that won the legal suspension in 2022 say city authorities have ignored multiple requests to provide details on how the system would work, and where it would be used.

Ibarrola’s ordeal was the result of a data entry error – somehow, his ID number was entered into the database instead of that of the fugitive criminal, who had the same name as him. And so police were alerted when he stepped off the train.

Ibarrola, who was 39 at the time and worked in a chicken processing factory, was whisked away to Bahia Blanca, the town where the robbery occurred, almost 650 kilometres south.

He spent nearly a week in custody before authorities realised their mistake and released him.


An aerial view over Buenos Aires. PICTURE: Andrea Leopardi/Unsplash

Ibarrola’s case is just the most dramatic of dozens of alleged wrongful arrests compiled by rights groups before the surveillance system was declared unconstitutional.

“The system violates many rights of individuals such as the right to privacy, to assembly, and to move freely,” said Matías Otero, founding member of the Argentine Computer Law Observatory, or ODIA, one of the NGOs that brought the case.

“This software may indeed be a good solution for spotting fugitives. But at what cost in terms of personal rights?”

The issue has become a talking point again as city authorities seek to reactivate the surveillance system, which was turned off during the COVID-19 pandemic because mask-wearing reduced its efficacy.

In February, a court ruled that the city government and rights groups must reach an agreement on how to audit the system before its reintroduction – for which no date has yet been set.

“Facial recognition is still too unreliable to be deployed freely,” said Beatriz Busaniche, director of Via Libre Foundation in Buenos Aires, which was part of the 2022 legal case against the SRFP.

“A sophisticated banking app may struggle to detect a person’s face with a white background, good lighting, and a smile to the camera. Now imagine someone moving on the street, looking down at their phone, walking quickly, and in low light. How reliable can this be?”



The issue has become even more pertinent after the government of President Javier Milei, a right-wing libertarian who took office in December, floated a proposal to use facial recognition to identify those protesting against his austerity policies and cut their benefits.

The justice and security ministry of Buenos Aires, which oversees the surveillance system, did not respond to multiple requests to comment.

Global use of facial recognition has grown rapidly, despite its poor track record identifying women and people of colour. Aside from solving crimes, accessing apps or policing borders, it is also used to target dissidents and activists from Iran to China.

Across Latin America, governments are increasingly turning to surveillance technology to prevent crime, often without informing citizens of the scope of these measures.

In Brazil, for example, facial recognition operates in 21 of 27 states, be it for monitoring urban transport or tracking school attendance.


The Fugitive Facial Recognition System (SRFP) screens individuals in a Buenos Aires subway station. PICTURE: Buenos Aires City Government/Handout via Thomson Reuters Foundation.

Buenos Aires’ city authorities have said their system – contracted to the Buenos Aires-based company Danaide SA – led to the arrests of more than 1,600 fugitive criminals.

They conceded that some errors were made, including Ibarrola’s arrest, but blamed these on the judicial databases, rather than the facial recognition software.

Buenos Aires’ ombudsman, María Rosa Muiños, said necessary security and transparency checks were not carried out when the system was originally installed.

“The design of [facial recognition] software invariably exposes biases, and its use involves the processing of large amounts of sensitive data – for example, people’s race, ethnicity or gender – which can become powerful discrimination devices,” she said in an email to Context.




Cities, including Boston and San Francisco, have banned facial recognition software due to privacy concerns and potential biases.

“The problem with facial recognition software is that there is just a lot of room for abuse,” said Paul Bischoff, an editor at Comparitech, a UK-based cyber security and privacy research website.

“You don’t know who’s looking at the cameras, what government agencies have access, who they share information with or whether the next administration will use them correctly. To use this sort of technology, it needs to be extremely regulated.”

NGOs have also requested that a public awareness campaign be carried out in Buenos Aires before the system is reactivated, and the city authorities are now holding seminars to inform the public about the SRFP as part of this process.

“Privacy is not a right you lose the moment you step outside your doorstep,” said Busaniche. “Even in public spaces, citizens have a presumption of privacy as long as they haven’t committed any wrongdoing.”
