DEHRADUN, Uttarakhand — When the Punjab Police's Organised Crime Control Unit (OCCU) stormed a hotel room in Tarn Taran district last year to bust a drug deal, one suspect got away.
But a photograph from a phone recovered from the scene was enough to track him down.
"We looked at their phones, we saw a photo—a group selfie they had clicked," said Nilabh Kishore, Deputy Inspector General of the Indo Tibetan Border Police (ITBP), who was formerly in charge of OCCU.
Kishore's team cropped the missing man's image out of the photo and ran it through the Punjab Artificial Intelligence System (PAIS), which uses an artificial-intelligence-assisted face-recognition algorithm to match any photograph against an ever-expanding database of over 100,000 criminal records maintained by the state police.
It was also PAIS that found a match which, Kishore said, helped the police track down Vicky Gounder, an alleged gangster who was shot dead by the police earlier this year.
"The criminals are shaken up now," Kishore said. "When a constable can just point a phone at you and identify you along with your aliases and known accomplices, it strikes a note of fear in them."
A big data revolution is sweeping through police departments across India, as state police forces tie up with private companies to harness easily available commercial technology to monitor citizens. PAIS, for instance, was created by Gurgaon-based Staqu, a software company which began as an AI image recognition service for e-commerce companies, before branching out into law enforcement.
Resource-strapped investigators say these tools have made it easier to track dangerous suspects, map connections between gang members and control crime. Convicts need not consent to their images being used, as India has no law preventing it, although the police cannot take photos of undertrials.
Punjab is not the only state using such technology; Rajasthan, Maharashtra, Bihar and Telangana are all looking to implement similar systems, and Andhra Pradesh has networked multiple departments to create a state-wide surveillance system.
The Crime and Criminal Tracking Network & Systems (CCTNS), a union-government-funded programme, is compiling a nationwide biometric database of criminals that the government hopes to integrate with the Aadhaar database. In July this year, the Union cabinet also proposed a bill allowing the creation of a national DNA database.
Yet, much like the controversial Aadhaar programme, so-called "smart policing" is on the rise with little public scrutiny of what this means for the rights and liberties of citizens. Experts worry that policing technology such as face recognition, which has serious implications for the privacy and freedom of citizens, remains largely untested.
"As these tools are proprietary, we can never have access to the source code. There are no impact assessment studies," said Mishi Choudhary, managing partner, Mishi Choudhary & Associates, a law firm specialising in technology law. "We or the police have no way to assess the risks of biased data or faulty prediction systems that these may be. I see a feedback loop where police keeps going back to the same people."
The consequences of handing such technology to India's notoriously poorly trained policemen could be devastating, analysts say.
"Databasing is a very potent and dangerous exercise because without context, quantitative data can mean anything and everything," said Noopur Raval, a PhD candidate in Informatics at the University of California, Irvine, and formerly an affiliate at the Berkman-Klein Center for Internet and Society at Harvard University. "Datafication, or these dashboard fantasies, can draw a very wrong picture."
The heart of PAIS, Punjab Police's AI-based face recognition system, is an innocuous-looking app with an intuitive interface: a grid of orange icons on a yellow background, with options like Face Search, Text Search and Gang Tree Search.
When confronted with a suspect, officers can snap a photograph with their smartphone and search it against a database compiled by uploading pictures of convicts housed in jails across Punjab. Since this began a little over a year ago, more than 100,000 records have been added to the system. Kishore, the policeman, said undertrials are not added to the system.
"What we did was put phones in the hands of policemen in the jails, rather than putting them in all stations. We purchased just about 30 phones and used them to snap pictures of all new people coming in, and also from the people who were already in jail," he explained. "So when they would take the photos, they would also enter the data on the phone right there, and most of it is menus and lists so you can quickly fill up the information without having to sit on a computer and type for a long time. We did this only for male prisoners, because we thought if male constables are taking photos of female prisoners, it could become controversial."
The results of PAIS' face-matching technology are sometimes surprising. Kishore started off by showing HuffPost India a photograph of a heavily bearded man wearing a turban, with less than a third of his face visible. In seconds, the system returned a set of around ten different photos.
The top match, with an 82% match score, was an image of a clean-shaven man with short hair, wearing a t-shirt.
At first glance, it's hard to believe the two images are of the same person, but stare at the photographs long enough and similarities emerge in the nose, the shape of the lips, and the eyes.
"It's not always the top person suggested, this is only the start. Then you have to do investigation to figure out who the person really is, but in my experience, the correct person has always been one of the top four suggestions in the app," said Kishore.
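Match scores and ranked suggestions of the kind Kishore describes are typically produced by converting each face image into a numeric "embedding" and ranking database records by how similar their embeddings are to the query. PAIS's internals are proprietary, so the sketch below is only a generic illustration of that ranking step; the record names and vectors are invented, and real embeddings come from a neural network and have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (closer to 1.0 = more alike)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_matches(query, database, k=4):
    """Return the k database records most similar to the query embedding."""
    scored = [(cosine_similarity(query, emb), name) for name, emb in database.items()]
    scored.sort(reverse=True)
    return scored[:k]

# Toy database: in a real system these vectors come from a trained model.
database = {
    "record_001": [0.90, 0.10, 0.30],
    "record_002": [0.20, 0.80, 0.50],
    "record_003": [0.85, 0.15, 0.35],
}
query = [0.88, 0.12, 0.32]  # embedding of the photo snapped by the officer

for score, name in top_matches(query, database):
    print(f"{name}: {score:.0%} match")
```

This is also why, as Kishore notes, the correct person need not be the top suggestion: the system returns a ranked shortlist, and investigation decides among the candidates.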
The ambiguity of face-recognition technology, and the outsized claims made for it, have prompted questions from civil rights experts.
In July, the American Civil Liberties Union compared photographs of members of the US Congress with 25,000 mug shots of criminals using Rekognition, a face recognition software Amazon is pitching to law enforcement agencies in the US. The software misidentified 28 of the politicians as criminals who had been jailed.
"It's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins," the ACLU report said. "People of colour are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that."
The bias in AI systems comes from their training data. In the case of Staqu's software, the company used publicly available images to 'teach' the AI what a human face looks like. In India, it isn't hard to imagine a similar algorithmic bias against minorities.
"Technology itself is a human product and as such, it is subject to the same social influences of all human behaviour," said Dr. Winifred R. Poster, a professor at Washington University, St. Louis, who has written on the subject. "Researchers at MIT have shown that facial recognition systems are very effective at reading the faces of white people, but they are egregiously poor at reading black faces, with error rates that increase exponentially."
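The disparity Poster describes is easily hidden by aggregate numbers: a system can report a healthy overall accuracy while one group absorbs most of the errors. The figures below are invented purely for illustration and are not taken from the MIT study:

```python
# Hypothetical per-group test results: (group, was the face matched correctly?).
results = ([("group_a", True)] * 95 + [("group_a", False)] * 5
           + [("group_b", True)] * 65 + [("group_b", False)] * 35)

def accuracy(rows):
    """Fraction of rows where the match was correct."""
    return sum(ok for _, ok in rows) / len(rows)

overall = accuracy(results)
by_group = {g: accuracy([r for r in results if r[0] == g])
            for g in ("group_a", "group_b")}

print(f"overall accuracy: {overall:.0%}")  # a respectable-looking headline number
for group, acc in by_group.items():
    print(f"{group}: {acc:.0%}")           # but the errors are not evenly shared
```

Here the headline accuracy is 80%, yet one group is misidentified seven times as often as the other, which is precisely the pattern audits of commercial face-recognition systems have flagged.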
Aside from the problem of accuracy, there is also the question of whether laws are keeping up with technology. This is particularly true as technology is sourced from around the world.
"The transnational circulation of knowledge professionals in the surveillance industry is a growing trend. Indian tech entrepreneurs have been designing and marketing surveillance systems for prisons that are sold in the US," she said. Such companies have databases of the prisoners, with videos and photos, as well as biometric data and analysis. "Because such technology is being developed across state lines, regulation and oversight should be international as well," Dr. Poster said.
Meet the makers
Staqu, the company that built PAIS, has contested claims that the technology can lead to mistakes in law enforcement.
"Earlier, to recognise a face, the algorithm was looking at specific points, and making a wireframe of the face, which was less accurate," said Atul Rai, co-founder and CEO of Staqu. "Today, we're able to make a much more sophisticated model that takes into account changes like weight gain or loss, age and occlusion, so even if you grow a beard, it will work."
Staqu was launched as an AI solutions company focused on computer vision. While its early clients were in e-commerce, it was looking to operate in any niche where computer vision could be deployed. Rai first pitched the idea of using his company's technology to find missing persons, but soon discovered an opening in the security space, where Staqu has been growing since.
Today, Staqu is also working with the Dubai police on a predictive model that includes real-time data, with a goal of reducing crime 25% by 2021. It was chosen from a slate of nearly 700 international applicants.
It's not the only one either—there are several private sector companies offering similar solutions. FaceOrbit has been working more closely with the private sector. Like Staqu, it didn't start as a security company—its speciality was the transmission of video over the Internet. This made it an ideal service provider for CCTVs, and then its creators decided to work with artificial intelligence and machine learning to provide the "brain of a camera".
"Wherever you get video from, we can analyse it and help you take the appropriate action. We can do gesture analysis, face recognition, weapon detection, so if there's any security issue, it can be flagged automatically," said Sanjay Sinha, founder and chief mentor of FaceOrbit. Some malls are also interested in the technology, and deployment has begun. Another company in this space is Facetagr, based out of Chennai.
This technology can also be used to identify images from outdoor CCTV footage, which raises the prospect of the police filming public gatherings and using the technology to profile people protesting against government policies.
In one case, it was used on footage from a CCTV camera in Punjab. In the video, two men are standing by a scooter having an argument. Two other men walk up to them, and then casually start shooting. The first two men drop instantly—it's like a movie, though with less blood. One of the shooters looks around, and his face points towards the CCTV camera for a split second. This blurry image was enough to make a match, sources said.
The most comprehensive example of AI-powered surveillance, though, is in China, where a national surveillance system is taking shape, used by everyone from hotels, airports and banks to, of course, the police.
The Aadhaar Issue
Policemen like Kishore say fears that deploying such technology will result in the creation of a police state are overblown. PAIS has made a huge difference in Punjab, he said, where sharing information between police stations was a huge problem—a difficulty accentuated by the constant movement of suspects from district to district.
At the same time, a senior law enforcement official told HuffPost India on background that AI-based policing would eventually be integrated with Aadhaar's vast citizen database.
"I believe that Aadhaar access will have to be given to the police," he said. "Right now, only people who have been jailed are being tracked. If you give the police more data, it will only benefit people."
The government has already spent crores on gathering Aadhaar data, the policeman said. "Why would you want to waste money to duplicate the effort?"
"Laws and regulations need to catch up to the pace of technology growth," said Sharmila Nair, a legal consultant working on information technology and the scope of AI laws.
"The regulated use of predictive policing would help in reducing crime rates," Nair said. "But simultaneously the aim should be to uphold the fundamental right to privacy."