
Why Facial Recognition Can Have ‘Serious Consequences’ As A Policy Tool

Artificial Intelligence (AI) has received significant interest from the private and public sectors in the past few years. NITI Aayog published a national strategy for AI in 2018, recommending investments in research, building an AI workforce, and creating a supply chain ecosystem. The 2020 Union Budget also highlighted Machine Learning and AI, allocating ₹8,000 crores to set up a National Mission on Quantum Computing and Technology.

This blog examines the impact of one application of AI in the public sphere: facial recognition technologies (FRT). Unfortunately, FRT currently has technological limitations, and its use by the state can lead to serious consequences when people are misidentified. Beyond this, the use of FRT has deep implications for the relationship between citizens and the state.

Facial recognition is increasingly being used by governments across the globe, with the global market predicted to reach USD 7 billion by 2024, and the Indian market alone predicted to reach USD 4 billion by the same year. While a component-wise budget breakdown of how the Government of India (GoI) is using FRT was not accessible at the time of writing, a deeper conceptual understanding of its use is still possible.


How Does Facial Recognition Work?

Facial recognition algorithms typically rely on Machine Learning: they convert images into patterns readable by computers and match those patterns against a target database. The algorithm learns how to create and match patterns by being trained on a test database with a large sample set; the test database is usually pulled from existing datasets, for example, photographs from online sources. Once trained, the algorithm is applied to a database to match a target photograph (for example, matching a screengrab from a CCTV camera against a database of registered criminals).

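To make the matching step concrete, here is a minimal sketch of one-to-many face matching using the open-source face_recognition Python library as a stand-in for the proprietary systems discussed in this blog; the file names and the tolerance threshold are illustrative assumptions, not details of any deployed system.

```python
# Minimal sketch: match a "probe" photograph against a small enrolled database.
# Uses the open-source `face_recognition` library; file names and the tolerance
# value below are illustrative assumptions only.
import face_recognition

# "Target database": convert each enrolled photograph into a numeric pattern
# (a 128-dimensional vector) that a computer can compare.
enrolled_files = ["person_a.jpg", "person_b.jpg", "person_c.jpg"]
enrolled_encodings = []
for path in enrolled_files:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photographs where no face was detected
        enrolled_encodings.append(encodings[0])

# Probe image, e.g. a frame grabbed from a CCTV feed.
probe = face_recognition.load_image_file("cctv_frame.jpg")
probe_encodings = face_recognition.face_encodings(probe)

if probe_encodings:
    # Compare the probe pattern against every enrolled pattern. The tolerance
    # controls how strict a match must be: a looser value finds more matches,
    # including false ones; a stricter value misses more genuine matches.
    matches = face_recognition.compare_faces(
        enrolled_encodings, probe_encodings[0], tolerance=0.6
    )
    distances = face_recognition.face_distance(enrolled_encodings, probe_encodings[0])
    for path, matched, dist in zip(enrolled_files, matches, distances):
        print(f"{path}: match={matched}, distance={dist:.3f}")
else:
    print("No face found in the probe image.")
```

The tolerance value captures the trade-off at the heart of the accuracy debate: loosen it and the system flags more people, including innocent ones; tighten it and genuine matches are missed.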
Understanding the underlying technology is critical to recognising its limitations at this stage.

Multiple studies have demonstrated that existing algorithms have high inaccuracy rates, frequently misidentifying people of colour, women, and non-binary individuals (people who do not identify as exclusively male or female). These inaccuracies stem from the limitations of FRT in its current state, as well as from racial and gender biases embedded in the databases used to train the algorithms.

Examples of incorrect identification abound: Google Photos tagged African American individuals as gorillas; Amazon Rekognition matched 28 members of the United States Congress to a database of criminal mugshots; and a Massachusetts Institute of Technology study of three commercial systems found error rates of up to 34% for women of colour, 49 times higher than for white males.

Simply put, the technology is not up to the mark yet. While this may not have significant consequences when you use FRT to unlock your phone, it can have serious consequences when the technology is used for state surveillance or as a policy-making tool.

Facial Recognition In India

One of the earliest uses of FRT by the GoI was to locate missing children; however, the project has had accuracy rates of less than 2%. Moreover, the Ministry of Women and Child Development testified in court that FRT was unable to even distinguish between genders while tracking missing children.

The use of FRT for missing children may have opened the door for use in other contexts; as of today, facial recognition is used by police forces in Delhi, Mumbai, and Telangana, and is being trialled at airports. More recently, facial recognition tools were used to identify and arrest people during the Delhi riots.

The most important development has been GoI’s announcement of intent to build the world’s largest facial recognition database by 2021.

The Request for Proposals (RFP) for the national Automated Facial Recognition System (AFRS) has an estimated budget of ₹308 crores and outlines the use of passport, criminal, fingerprint and ‘any other’ databases. The terms of the RFP state that bidding contractors must have implemented three similar projects globally, with a database size of at least 1 million, and must have had an annual turnover of ₹50-100 crores in the past three years (details here, here, and here).

These clauses make it unlikely that an Indian company will win the contract, given the size and past experience of Indian companies in the FRT space. This raises the spectre of FRT algorithms trained on western-centric databases being deployed in India, where they are likely to have even higher inaccuracy rates, in addition to the other issues outlined above.

Sanaya is a Senior Research Associate at Accountability Initiative.

Featured image: EvolvingScience.Com