FRT law yet to catch up

There are currently 16 different facial recognition technology (FRT) systems in active use by various Central and State governments across India for surveillance, security or authentication of identity. Another 17 are in the process of being installed by different government departments.

While FRT systems have seen rapid deployment by multiple government departments in recent times, there are no specific laws or guidelines regulating the use of this potentially invasive technology.

This, legal experts say, poses a huge threat to the fundamental rights to privacy and freedom of speech and expression, because such deployment does not satisfy the threshold the Supreme Court set in its landmark privacy judgment in ‘Justice K.S. Puttaswamy vs Union of India’.

In 2018, the Delhi police became one of the first law enforcement agencies in the country to start using the technology. It, however, declined to answer a Right to Information (RTI) query on whether it had conducted a “privacy impact assessment” prior to deployment of the facial recognition system (FRS).

Advocate Apar Gupta, co-founder of the Internet Freedom Foundation (IFF), in his RTI application had also asked the Delhi police whether there were any guidelines, policies, rules or standard operating procedures governing its use of facial recognition technology.

The Delhi police vaguely replied, “The FRS technology may be used in investigation in the interest of safety and security of general public”. In the same RTI reply, the Delhi police also stated that the use of facial recognition technology was authorised by the Delhi High Court.

Function creep

However, advocate Anushka Jain, associate counsel (Transparency &amp; Right to Information) at the IFF, pointed out that the police had obtained permission to use the FRS through an order of the Delhi High Court, for the specific purpose of tracking missing children.

“Now they are using it for wider security and surveillance and investigation purposes, which is a function creep,” Ms. Jain said.

Function creep happens when information is used for a purpose other than the one originally specified.

In December last year, the Delhi police, with the help of an automated facial recognition system (AFRS), was comparing the details of people involved in violence during the anti-Citizenship Act protests in Jamia Millia Islamia with a data bank of more than two lakh ‘anti-social elements’.

Ms. Jain said: “The function has widened at the back end and we don’t actually know for what purpose they might be using it and how they are being regulated and if there is any regulation at all”.

“This might lead to an over-policing problem or problems where certain minorities are targeted without any legal backing or any oversight as to what is happening. Another problem that may arise is of mass surveillance, wherein the police are using the FRT system during protests,” Ms. Jain said.

If someone goes to a protest against the government, and the police are able to identify the person, then there might be repercussions, she argued. “This obviously has a chilling effect on the individual’s freedom of speech and expression and right to protest as well as my right to movement”.

“This might lead to the government tracking us all the time,” she added.

Proportionality test

Vidushi Marda, a lawyer and researcher at Article 19, a human rights organisation, said the Supreme Court in the Puttaswamy judgment ruled that privacy is a fundamental right even in public spaces.

“And if these rights need to be infringed, then the government has to show that such action is sanctioned by law, proportionate to the need for such interference, necessary and in pursuit of a legitimate aim,” Ms. Marda said.

She flagged various issues with the AFRS, an ambitious pan-India project under the Home Ministry which will be used by the National Crime Records Bureau (NCRB) and various States’ law enforcement departments.

“The IFF filed a legal notice to the Home Ministry asking under what legal basis was the AFRS built, since, as per the Puttaswamy judgment, it does not meet the threshold of proportionality and legality,” Ms. Marda said.

“The basis of the AFRS is a Cabinet note of 2009. But the Cabinet note is not a legal substance; it’s a procedural note at best. So it does not form a valid legal system based on which the AFRS can be built,” she added.

Questionable accuracy

Ms. Jain, who is currently working on Panoptic, a project to track the deployment and implementation of FRT projects in the country, said the technology has yet to achieve 100% accuracy in finding matches.

“In case an inaccurate system is installed, two things can happen. There can be a ‘false positive’, wherein somebody is recognised as somebody they are not, or a ‘false negative’, wherein the system refuses to recognise the person as themselves,” she said.

As an example of a ‘false positive’, she cited the police using the FRT system to identify and arrest somebody who is not the suspect. If a ‘false negative’ occurs when the government is using the FRT system to deliver its schemes, many people could face exclusion from such government schemes, Ms. Jain added.
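The two error types Ms. Jain describes can be illustrated with a minimal sketch. FRT systems typically compare faces by computing a similarity score and declaring a match when the score clears a threshold; the scores and threshold below are hypothetical, not drawn from any real deployed system.

```python
# Hypothetical illustration of 'false positive' and 'false negative'
# errors in a threshold-based face matcher. All numbers are invented.

THRESHOLD = 0.80  # hypothetical cut-off above which a 'match' is declared


def is_match(similarity: float) -> bool:
    """Declare a match when the similarity score clears the threshold."""
    return similarity >= THRESHOLD


# False positive: two DIFFERENT people happen to score above the
# threshold, so the system wrongly flags them as the same person.
false_positive = is_match(0.85)

# False negative: the SAME person scores below the threshold (poor
# lighting, ageing, camera angle), so the system refuses to recognise them.
false_negative = not is_match(0.70)

print(false_positive, false_negative)  # prints: True True
```

Raising the threshold reduces false positives but produces more false negatives, and vice versa — which is why an inaccurate system harms people in both policing and welfare-delivery contexts.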

“These FRT systems are being developed and deployed across India without any legal framework in place, which creates a lot of problems. If the police catch hold of you through the FRT system, what do you do? What are your remedies? There is no framework in place under which you can even question them,” she pointed out.

Mishi Choudhary, a technology lawyer and digital rights activist, said, “Many cities and states in the U.S. have either completely banned the usage or imposed a moratorium on the usage of facial recognition tech”.

“Companies like IBM, Microsoft have decided not to sell these technologies to law enforcement at all. Even Amazon has imposed a moratorium. Facial recognition technology has not only been invasive, inaccurate and unregulated but has also been unapologetically weaponised by law enforcement against people of color,” Ms. Choudhary added.

“In India, we have no law to protect people, no guardrails about usage of data by private players or the government. We hear several news reports of police abuse even without the aid of technology. Facial recognition is a perfect form of surveillance that builds tyrannical societies. It automates discriminatory policing and will exacerbate existing injustices in our criminal justice system,” Ms. Choudhary said.

Mr. Gupta gave a similar view. “India is facing a facial recognition pandemic — one without any safeguards or remedies for the harms of exclusion, profiling and surveillance. Without urgent action, such systems of mass surveillance will erode democratic liberties and threaten the rights of lakhs of Indians,” said Mr. Gupta.