
Automated surveillance technology that uses drones to spot problematic human behavior in crowds is going to be tested at the Technozion and Spring Spree festivals at NIT Warangal, The Verge reports. Lead researcher Amarjot Singh of the University of Cambridge claimed that their system is 94% accurate at identifying violent poses. However, this accuracy drops as more people enter the frame (as there would be at a festival) - for example, to 79% with 10 people in the frame.

Police surveillance has grown with little scrutiny in recent years. The laws governing such surveillance have grey areas in which a lot of video surveillance technology currently operates. Reported applications include behavior recognition, as in the case of the surveillance drones reported by The Verge, and facial recognition linked with police records, including tagging personal information with Aadhaar and sharing it across states.

An increasing number of cities have police using various kinds of surveillance databases to get better information on suspects and potential criminals. These databases, to which individual policemen can add people's information, have some disturbing implications. Several cities are using facial recognition software to help policemen keep track of criminals.

Surveillance of everyone, not just criminals or suspects

There are several cities where CCTV camera networks scan everyone on the street and match their faces against a database of suspects and criminals. Here is a partial list:

In 2015, Surat became the first city in India to deploy real-time surveillance through facial recognition systems when it implemented NEC India's FaceWatch in collaboration with Innovative Telecom & Softwares. The system uses live feeds from a growing network of CCTV cameras and can be used to monitor for crime in real time. It is capable of facial recognition as well as automatic number plate recognition. Also, "It automatically matches faces against a database of 30,000 criminal mugshots and can alert the police immediately of anyone on a watchlist." By August, Surat had 604 cameras in 114 locations, covering 10% of the city, with plans to add another 900 cameras in a year and bring the total to 2,500 in two years.

In 2016, Mumbai made operational 4,617 CCTV cameras hooked to the RTO control room, backed by 1,000 GPS-fitted vehicles coordinating with the control room, with the objective of tackling law and order, fighting and preventing crime, regulating traffic and detecting traffic-related offences. These cameras are also capable of automatic number plate recognition as well as facial recognition. Additional Chief Secretary (Home) K P Bakshi told the Indian Express, "We can search for an individual all over the city. The cameras will identify the face of a wanted criminal. The camera will also pick out faces of persons roaming around continuously in one place. The nearest police van will then be alerted about the person's location."

In 2016, 160 CCTV cameras were installed in Visakhapatnam as part of a hi-tech surveillance network.

In 2017, Jaipur police trialed a facial recognition system with cameras installed outside the Ganesh temple at Modi Doongri and controlled from the command and control centre called "Abhay". The FRS would scan the people before it and match them against a database of serial offenders and suspects.

In 2018, cameras with facial recognition technology are expected to be in use in local trains on the Central line in Mumbai by the end of the year, at a total cost of Rs 276 crore. The cameras "will store facial details of commuters (for 10 days). The cameras with facial recognition software would help trace past movements of any offender on a local train and arrest the person when he travels next." A total of 11,160 cameras will be procured - 76 cameras for each rake, with at least 6 cameras in each coach.

In 2018, Hyderabad city police are matching the faces of everyone on the city's streets against a database of one lakh criminals, from the control room at the Facial Recognition Analytics unit at the Commissioner's office at Basheerbagh. IT Cell in-charge K. Sreenath Reddy said that the local police are alerted only when the resemblance is more than 70 per cent.

Thiruvananthapuram police are using 233 cameras in their surveillance network of the city.

Paradip in Odisha is to get a CCTV surveillance camera network within a month.

Retired ACP Dhoble (of hockey-stick-wielding moral police fame) is now in the process of getting facial recognition software created for the city with the "help" of his son Kshitij, who specialized in Artificial Intelligence at Auckland University. An effort that initially began with the goal of tracing missing people has expanded its objective to "tracking criminals" as well. "Meanwhile, they began compiling the information of all 15,847 police stations in India and uploaded it on the site. One aspect of the site is uploading the information of these police and stations. The other is to spot child beggars, labourers and send it to the site."

Police database for use with mobile app - FaceTagr

This is a database of criminal records that can be used with facial recognition software (FaceTagr) installed on the Android mobile phones of beat policemen and inspectors working in the field. When a policeman scans a suspect's face, the mobile app returns data on the police cases filed against the matched criminal and the relevant police station limits. Since the database can be expanded, it has the potential to store the records of criminals across the country.
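The workflow described - scan a face, match it against stored records, return case history and station limits - can be sketched roughly as below. Everything in this sketch (the names, fields and matcher) is invented for illustration; FaceTagr's actual data model and API are not public.

```python
# Hypothetical sketch of the reported workflow: scan a face, return records.
# Every identifier and field below is invented for illustration - the real
# FaceTagr data model and API are not public.

RECORDS = {
    "face_id_0421": {
        "cases": ["Cr.No. 112/2016 - theft"],  # hypothetical case entry
        "station_limits": "T Nagar",
    },
}

def match_face(scanned_face):
    """Stand-in for the facial matcher: returns a face ID or None."""
    return scanned_face if scanned_face in RECORDS else None

def lookup(scanned_face):
    """What the beat policeman's app would do after a scan."""
    face_id = match_face(scanned_face)
    return RECORDS[face_id] if face_id else None

print(lookup("face_id_0421"))  # case list and station limits
print(lookup("unknown_face"))  # None - no record, no alert
```

The point of the sketch is how little stands between a street scan and a case file: one probabilistic match, with no warrant or due process in the loop.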

The application, originally built by Vijay Gnanadesikan, CEO of Haliscape Business Solutions, to help rescue children by matching records of missing and found children, was first trialled for police use in Chennai.

In 2017, FACETAGR was adopted by the T Nagar police station of Chennai, beginning with a database of 12,000 criminals. An additional 40,000 suspects were added to the app to improve the chances of police identifying faces. The app is used by policemen to "scan" suspects; once a suspect is scanned, the app returns information about them.

In 2018, Chennai police will expand the use of FACETAGR to cover interstate criminals as well, by expanding the data used by the application to other southern states. Currently the database has information on 67,000 criminals, including information sent by the Puducherry Crime Records Bureau. It is awaiting data from Andhra Pradesh, Telangana, Kerala and Karnataka. The application is in use in 10 out of 12 police districts and is installed on the phones of beat constables. 18 inspectors, sub-inspectors and 150 beat police of Washermanpet were the latest to get the app, with "700 criminals in A, A plus, B and C categories".

e-Petty

The e-Petty app is being used across Telangana state to book cases for minor crimes under sections of the IPC, the City Police Act, the Gaming Act, COTPA 2003, the Motor Vehicle Act and the Town Nuisance Act. The app can record photographic and video evidence from the crime scene and photographs of suspects, and generate an automatic chargesheet based on the evidence. The app also tracks individuals' previous cases and identifies repeat violators, because it links profiles online with Aadhaar card numbers.

Hyderabad

Hyderabad is probably the most surveilled city in the country. The Integrated People Information Hub pulls data from dozens of sources to create profiles of individuals that include not just their own comprehensive information, but that of their parents as well. It is a data-hoarding machine gone rogue, where there appears to be no reason or reasonable suspicion required to put citizens under surveillance. The surveillance includes call records, social media, relatives and friends, utilities and more.

Questions raised

The use of aggregated databases and Artificial Intelligence in large-scale applications is new in India, and the laws do not yet provide the necessary support for - or restrictions on - their implementation. There is no doubt that information is power, and information on suspects and criminals empowers police to do their jobs better. The lack of proper laws, policies, protocols and facilities for the police to record and access information securely has led to the adoption of various technologies in an ad hoc manner with little oversight.

However, large-scale use of such applications raises several serious questions:

Is it constitutional to treat every person as a potential criminal? When all the people entering the range of a facial-recognition-enabled camera are scanned and matched against databases of criminals, it amounts to intrusive surveillance. India lacks a data protection law or a law defining the contours of privacy; however, the recent robust arguments against surveillance and observations by judges in the constitutional challenge to Aadhaar are very clear that Indians do have a right to privacy and that surveillance violates this right.

Data ownership: FaceTagr is owned by Haliscape Business Solutions Pvt Ltd of Chennai. NEC is a global organization. It is unclear who owns or protects the data in these databases and what restrictions exist against its misuse.

Data access: Cortica, a foreign AI company, has formed a partnership with the Best Group to analyze CCTV footage from public cameras to predict crime. While it may be a technologically challenging goal, a foreign company with considerable ties to foreign intelligence has capabilities and access to individuals on Indian streets. The software is capable of using data from not just video cameras but satellite and drone footage as well, and can analyze human behavior, including differentiating between the nature of crowds - a routine market crowd or a protest, for instance. In the case of Mumbai, a company run by a software professional and a retired police official appears to have access to information from all police stations in India and is proceeding to build a database! It is unclear how and why software under development by private individuals has access to nationwide sensitive data.

A market of the gullible: The lack of proper evaluation or policies requiring specific standards has left the police of India a ripe target for companies selling surveillance products, who may exploit the real need for collecting information, or corrupt insiders, to gain contracts. Many of the technologies described here have not been subjected to robust testing and have no published research about their quality. Some of the stories describe extensive installations that become defunct or are not of adequate quality to begin with - as in the case of Visakhapatnam, left with 3 working cameras out of 160 within 2 years of installation at massive public expense. Others describe extremely efficient systems, but ones that violate the rights of the citizens they are supposed to serve. This risks spending public funds on purposes and methods that may not be in the public interest. There is an urgent need to consult independent experts, digital rights law researchers and other professionals without conflicts of interest to put together guidelines on data collection for surveillance, data destruction once its purpose is served, securing that data to prevent misuse, and policies on who should have access, with a transparent process for granting such access.

Who is a criminal or suspect: It doesn't take a lot for police to consider someone a suspect and there is little oversight. There is no warrant or independent authority required to initiate surveillance against anyone. Such a database has the capacity to take the local prejudices of police across state lines and cause considerable harassment to individuals in all areas covered by such databases.

Utility: While there is obviously a need for police to monitor suspects in order to gather evidence, the legality and utility of randomly spotting them on the street is debatable. What is the utility of someone say.... suspected of having conducted a robbery... being spotted in another state - if it even is the same person?

Technological limitations: Such "identification" is inherently probabilistic and can be wrong. A good example would be the Welsh police wrongly identifying over two thousand people as potential criminals when they used Facial Recognition at the 2017 Champions League final in Cardiff in a crowd of 170,000 spectators. This has the potential to create a lot of harassment as well as waste police resources when applied to the far bigger numbers of people on the street in Indian cities.
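The arithmetic behind such false positives is simple to sketch. A minimal Python calculation follows; the false-match rate and the number of genuine watchlist members are illustrative assumptions, not figures from any deployed system.

```python
# Expected wrong alerts when everyone in a crowd is scanned.
# The rates used below are illustrative assumptions, not measured figures.

def expected_false_alarms(people_scanned: int, false_match_rate: float) -> int:
    """People wrongly flagged = people scanned x per-person false-match rate."""
    return round(people_scanned * false_match_rate)

# A crowd the size of the 2017 Cardiff final, with an assumed 1.5%
# per-person false-match rate:
alarms = expected_false_alarms(170_000, 0.015)
print(alarms)  # 2550 wrong alerts for police to chase

# Even if 20 genuine watchlist members were present and all were caught,
# fewer than 1 in 100 alerts would point at the right person.
precision = 20 / (20 + alarms)
print(round(precision, 3))  # 0.008
```

The lesson scales badly: a false-match rate that sounds small per scan produces thousands of wrong alerts once entire city streets are scanned continuously.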

Bypassing consent: A person suspected by the police and asked to come for questioning has rights. They can agree or refuse and the police cannot actually force them to say.... stand in a line up to be identified without any due process. Or they may wish to have a lawyer present when interacting with a policeman as a suspect. However, use of software such as this allows a beat constable to completely arbitrarily scan people who may not even realize that they are actually in a situation with the law where they may need to exert choices to protect their interests.

Human rights: As often happens when the state adopts technology, the advantages of the technology have been understood and promoted, but there appears to have been little consideration given to human rights implications of falsely accused individuals, potential for corruption through entering or removing entries on the database for bribes or blackmail, consequences of false positives to innocents and other potential fallout. There needs to be better consultation by the state when adopting such technologies with professionals (other than those providing the technology as a solution) to assess the wider impact beyond the immediate problem the technology aims to solve and mitigate the potential for harm.

Ability to maintain technology: Of the 160 cameras installed in Visakhapatnam in 2016, only 3 were working in 2018 - and one of those, pointed at the ground, was useless.

Aggregated or discrete databases? It is not known whether the databases used to identify criminals through CCTV or the FaceTagr app or e-Petty are linked where they coexist. Aggregation of data across these databases has even more potential for the violation of rights of citizens.

Magnifying social prejudices: A simple statistical reality is that positives - whether real or false - will be higher among those who get scanned more. In a country with considerable documented evidence of prejudice against religious minorities and underprivileged castes, classes and communities, the use of such software has the potential to magnify and endorse the prejudices that cause their targeting. Take, for example, reported cases of slums being raided and all the men in them being asked to identify themselves. The chances of these men being identified - correctly or falsely - will always be higher than for, say, a person living in a gated society where such raids are unheard of, simply because their faces will get scanned more often.
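That statistical reality can be made concrete with a back-of-the-envelope sketch. All of the numbers below - the scan counts and the error rate - are assumptions for illustration only:

```python
# With identical per-scan error rates, whoever is scanned more often
# accumulates more matches - false ones included. The scan counts and
# error rate below are illustrative assumptions.

scans = {"slum_resident": 50, "gated_resident": 2}  # assumed scans per year
FALSE_MATCH_RATE = 0.01  # identical for everyone - the system is not "biased"

false_matches = {who: n * FALSE_MATCH_RATE for who, n in scans.items()}
print(false_matches["slum_resident"])   # 0.5 expected false matches a year
print(false_matches["gated_resident"])  # 0.02
print(scans["slum_resident"] / scans["gated_resident"])  # 25.0x the exposure
```

Even a perfectly "neutral" algorithm thus concentrates its mistakes on whichever community the police choose to scan most, laundering an enforcement prejudice into what looks like objective output.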

Use of Aadhaar for profiling: The e-Petty app used in Telangana is a clear use of Aadhaar for profiling - something the government has consistently denied in the Supreme Court.

Lack of appropriate digital security: Apart from data being shared across state borders, hosted on private servers, or made accessible to foreign companies - which are issues of policy, to determine what is appropriate and what is not - there are outright failures of digital security, which result in unintended and unauthorized access to the very sensitive data being collected. Researcher Kodali, for example, had pointed out that the Hyderabad police were using a third-party portal to record and geotag crime. The portal, which had very poor security for the purpose it was being used for, had allowed crime reports to be indexed by search engines for years, including the names of rape victims - which is not legal in India.

Lack of independent audit or testing: The systems used for both large-scale CCTV surveillance and scanning individuals with a mobile app have no information available on their accuracy. The lower the accuracy, the more such systems will end up wasting police resources chasing dead ends and harassing citizens.

A need for legislation: It is undeniable that the police need effective ways to access databases to find information on suspects and criminals on the fly. It is also inevitable that this will involve a certain degree of invasion of privacy in the interests of conducting investigations. However, this cannot simply be left to whatever software developers believe can be done or police wish to adopt. There needs to be a regulatory framework that identifies the situations in which such use is legitimate and protects citizens from being arbitrarily entered into databases as suspects. There should also be regulation of what information should remain local and what should be disseminated - a local robbery suspect does not need to be findable across state borders, but an absconding criminal found in the footage of a murder should be. There is also a need for legislation to remove names from the databases when people are no longer suspects - for example, when the cases in which they were suspected are closed with others charged.

Further reading:

Research published by the Center on Privacy and Technology at Georgetown Law, "The Perpetual Line-Up" on the unregulated use of public surveillance by law enforcement and the risks.

Technological bias: While MediaNama was not able to find any research about FaceTagr specifically, "Face Recognition Performance: Role of Demographic Information" by the FBI about accuracy of Facial Recognition in various population demographics is an interesting read on the biases caused by how the system is "trained" to recognize faces.