La Quadrature du Net — https://www.laquadrature.net
First victory against French police drones
https://www.laquadrature.net/en/2020/05/18/first-victory-against-french-police-drones/
Mon, 18 May 2020

The Conseil d’État, France’s highest administrative court, has just issued its decision in our case against the surveillance drones deployed by the Paris police during the Covid lockdown. This decision is a major victory against drone surveillance. It rules illegal any drone that is equipped with a camera and flies low enough to let the police identify individuals by their clothing or a distinctive sign.

According to the Conseil d’État, only a ministerial decree reviewed by the CNIL could allow the police to use such drones. As long as no such decree has been issued, the French police may no longer use their drones. Note that today’s decision concerns the Covid health crisis, a far weightier purpose than those the police usually invoke to deploy drones.

Other devices of the Technopolice are still being used without a legal framework: automated CCTV, sound sensors, predictive policing… This decision strengthens our resolve to continue these fights.

Our Arguments Against the French Contact-Tracing App StopCovid
https://www.laquadrature.net/en/2020/04/23/our-arguments-against-the-french-contact-tracing-app-stopcovid/
Thu, 23 Apr 2020

Last week, French President Emmanuel Macron invited the French Parliament to discuss the potential use of StopCovid, the contact-tracing application his government is developing. We have just sent Members of Parliament the following summary of our arguments regarding this application.

The StopCovid application would be useless, could endanger our civil liberties and could even worsen the health crisis. The French administration and Parliament must stop investing human and economic resources in this vain and dangerous project. The real emergency lies elsewhere.

Dubious Effectiveness

Low Adoption Rate

According to first approximations, at least 60% – but more probably 80% to 100% – of the population would need to use the app for it to have a significant effect, and that is assuming the app produces reliable data;

Only 77% of the French population owns a smartphone, and this number drops to 44% for people over 70, even though they are among the most vulnerable;

Many people do not know how to turn on Bluetooth, and many refuse to keep it always on, for practical reasons (saving battery) or security reasons (many smartphones in use lack the latest security updates, and flaws in the Bluetooth protocol have been discovered over the last few years);

In Singapore, only around 16% of the population used a similar app, which did not prevent a subsequent lockdown.
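The arithmetic behind these adoption figures can be made concrete with a quick sketch of our own (a deliberately naive model, not taken from the article's sources): a contact is only recorded when both people involved carry the app, so under independent adoption the share of detected contacts falls roughly with the square of the adoption rate.

```python
# Simplified model (our own illustration): a contact event is recorded only
# if BOTH parties run the app, so assuming independent adoption the
# detection rate is roughly the adoption rate squared.
for adoption in (0.16, 0.40, 0.60, 0.80):
    detected = adoption ** 2
    print(f"adoption {adoption:.0%} -> contacts detected {detected:.0%}")
```

At Singapore's roughly 16% adoption, this model captures under 3% of contact events; even at 60% adoption, barely a third are captured.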

Imprecise Results

It is unlikely that tests will be made as readily available as they would need to be, preventing people from reliably reporting their true condition. Relying on self-diagnosis runs the risk of producing very high numbers of false positives.

There seems to be no consensus on the duration and distance of contact that justify warning a person who came “in contact” with a contaminated person.

In some very densely populated areas (some neighborhoods, shopping malls, large companies), one would see an explosion of false positives, which would effectively render the app useless.

Bluetooth range seems to vary widely from one device to another, and its precision is not necessarily good enough to provide reliable results. As the ACLU’s analysis puts it: “Other open questions include whether Bluetooth is precise enough to distinguish close contacts given that its range, while typically around 10 meters, can in theory reach up to 400 meters, and that its signal strength varies widely by chipset, battery, and antenna design”.

Counter-Productive for Public Health

By creating a false sense of security, the app could lead people to neglect basic protective behaviours, while failing to reliably notify them.

The energy and costs required to develop the app would not be invested in more effective solutions such as producing more masks, ramping up testing or promoting basic protective behaviour.

Implementing surveillance systems would increase the already strong distrust of the State in part of the population. Without confidence in the system, potentially ill people could be incited to hide their symptoms from medical services for fear of negative consequences.

Liberties Sacrificed in Vain

Discrimination

Whether through a mandate or through excessive social pressure, people deciding not to use the app could risk losing their jobs or being denied access to public spaces (cf. a case in Italy: https://www.laprovinciacr.it/news/italia-e-mondo/246504/coronavirus-ferrari-app-tracciamento-per-dipendenti.html), making their consent non-free and thus null and void.

One of the worst forms of discrimination would be to reserve easier access to testing for those who use the application.

Surveillance

If the app were used by part of the population, one may fear that the government could then more easily impose its use on the rest of the population, against its will. It is noteworthy that none of the security-driven, freedom-destroying measures taken in times of “emergency” is ever revoked: this ratchet effect strongly feeds the justified distrust of such control measures.

The goal of the app (warning targeted people) is by essence incompatible with the legal concept of anonymity. At best it offers pseudonymity, which does not protect against the risk of individual surveillance.

Publicly releasing the application’s code under a free and open-source licence, as well as using reproducible compilation methods, would be the minimum requirements against abuse, but they are not sufficient in and of themselves.
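As a minimal sketch of what “reproducible compilation” buys (the byte strings below are invented placeholders, not real StopCovid artifacts): anyone can rebuild the published source and compare cryptographic hashes, and only byte-identical artifacts prove that the distributed app matches the audited code.

```python
import hashlib

# Invented placeholder bytes: in reality these would be the published APK
# and an APK rebuilt independently from the released source code.
published_build = b"bytes of the distributed application"
independent_rebuild = b"bytes of the distributed application"

h_published = hashlib.sha256(published_build).hexdigest()
h_rebuilt = hashlib.sha256(independent_rebuild).hexdigest()

# A reproducible build yields byte-identical artifacts, hence equal hashes.
print("match" if h_published == h_rebuilt else "MISMATCH")
```

Any single differing byte changes the hash entirely, which is why this check is necessary but, as noted above, not sufficient: it says nothing about what the audited code actually does.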

Normalisation of Surveillance

No one can predict how long the app would be deployed for.

Once deployed, it will be easier for the government to add coercive functionality to the app (control and enforcement of quarantine at the individual level).

The app pushes us to submit our bodies to constant surveillance, which gradually reinforces the social acceptability of other technologies, such as facial recognition or automated video-surveillance, which are currently largely rejected.

Technological solutionism: the app reinforces the blind belief that technology and surveillance can be the main answers to health, environmental or economic crises when, on the contrary, they draw attention away from the real solutions: scientific research, adequate financing of public services…

Deploying an app whose objectives, technology and usage carry significant risks for our society and our freedoms, for likely mediocre – possibly even counter-productive – results, is not something we can consider acceptable, nor is it for many French people. The media and political attention, as well as the allocated budget, would be better spent informing and protecting people (and health workers) with proven methods such as providing more protective masks, medical equipment and tests.


Orange recycles its geolocation service for the global pandemic
https://www.laquadrature.net/en/2020/03/31/orange-recycles-its-geolocation-service-for-the-global-pandemic/
Tue, 31 Mar 2020

For years, Orange has been trying to market the gold mine that is our geolocation data (the list of relay antennas to which our phones connect during the day). The pandemic appears to be a good opportunity for the company to expand this market.

Flux Vision

In 2013, Orange launched its first product, Flux Vision, providing cities and tourist destinations with statistics on the “travel flows” of their visitors: number of visitors, length of stay, origin, routes travelled. The statistics provided are anonymous, but Orange produces them in a legally dubious manner.

Measuring the number of visitors at a location is as simple as counting the number of connections to a relay antenna, without processing any personal data. Fine. However, in order to evaluate the length of stay, origin or route, Orange has to process non-anonymous data that reveals each visitor’s position at different times during his or her stay. In practice, it is no longer just a question of counting the number of connections to a given antenna, but of looking at the identifier of each visitor. (During the 2016 Féria de Béziers, Orange revealed that a significant number of visitors came from Toulouse, allowing the city to better target its next advertising campaign (see testimonial). The company also tracked the position of people around the Féria at different times of the day, revealing for example that the people who usually lived there waited until the last days of the festivities to return home (see the graph illustrating this article). This information can only be produced by analysing each person’s location data. It does not matter that the data is then anonymised if, prior to anonymisation, it has been collected, processed and categorised for a purpose unrelated to the service the operator originally provides to its subscribers.)
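The distinction drawn above can be illustrated with a small sketch (entirely synthetic data and identifiers of our own invention): counting connections per antenna needs no personal data, whereas reconstructing lengths of stay or routes requires following each subscriber’s identifier across antennas.

```python
from collections import Counter

# Synthetic connection log: (subscriber_id, antenna) pairs.
events = [("s1", "antenna-A"), ("s2", "antenna-A"), ("s1", "antenna-B"),
          ("s3", "antenna-A"), ("s2", "antenna-B")]

# Anonymous footfall: connections per antenna, no identifiers needed.
footfall = Counter(antenna for _, antenna in events)

# Non-anonymous trajectories: the per-person routes that Flux Vision-style
# statistics are derived from, which ARE personal data before aggregation.
routes = {}
for subscriber, antenna in events:
    routes.setdefault(subscriber, []).append(antenna)

print(footfall)  # per-antenna counts only
print(routes)    # reveals, e.g., that s1 moved from antenna-A to antenna-B
```

The first computation never touches an identifier; the second cannot work without one, which is exactly why consent is required for it.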

The ePrivacy Directive and French law prohibit the processing of non-anonymous location data without consent. Within the Flux Vision framework, Orange never asks for this consent. For reasons that are still unclear, and without any legal basis, the CNIL tolerates mobile operators violating the law “in the field of tourism, land use planning and road traffic”. (This is unfortunately not an isolated case. In Article 5 of its 2019 guidelines on the use of online trackers, the CNIL again created an exception to justify collecting data without consent. That exception likewise concerns audience measurement – of websites – and authorises the storage and retrieval of cookies on our computers or smartphones for the “production of anonymous statistics”. It violates both Article 5(3) of the ePrivacy Directive and Article 82 of the 1978 French Data Protection Act, which are perfectly explicit: by law, no one is allowed to access your computer for something you did not explicitly ask for. Whatever the CNIL may say, no economic motive justifies infringing the inviolability of your computer equipment or your home.) In 2013, Orange was able to take advantage of this situation but, caught between illegality and the CNIL’s tolerance, the company has not launched any new offer for seven years.

A health crisis and a failing government have created a great opportunity for new strategies to bloom and for a new product to replace Flux Vision.

The opportunity of the crisis

European Commissioner Thierry Breton also saw an opportunity to help the industry that fed him: he brought together the eight main European operators (Orange, Deutsche Telekom, Vodafone…) to announce – among non-medical engineers, without a single pandemic expert – their strategy to fight the pandemic through population monitoring. Enough to showcase their commercial offers.

In France, Orange CEO Stéphane Richard is all over the media with a quite clear strategy: recycle the 2013 Flux Vision offer for the current global crisis. If Orange can already tell cities how tourists move in and out, surely this can be extended to infected and confined people. And if Orange plays its cards well in this crisis, it will have opened up a new, sustainable market. It will even have moved closer to other, similar and still disreputable markets: tracking demonstrators, young people in poor neighbourhoods, the homeless…

A great opportunity to diversify into the security market.

The support of the CNIL

And what does the CNIL do? Mediapart revealed that the CNIL is pushing the government towards comparable solutions which, in practice, are mainly those of Orange.

To justify itself, the CNIL adopts Orange’s spurious vocabulary, which boasts of providing “aggregated” statistics to give the impression of complying with the law. However, in order to provide “anonymous” travel statistics, Orange first analyses personal, non-anonymous data without the consent of the individuals concerned. This is illegal.

The CNIL should have required that Orange’s statistics be based on nothing other than purely technical data unrelated to individuals, such as the number of connections to base stations. For example, although it is not clear how the city of Paris estimated the 17% drop in its population since confinement, it could simply have compared the number of connections to its antennas on two dates, demonstrating that it is not necessary to break the law to produce such figures.
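The purely aggregate comparison suggested here is trivial arithmetic (the counts below are invented for illustration; the article does not give Paris’s raw figures):

```python
# Hypothetical antenna connection totals on two reference dates.
connections_before = 1_000_000
connections_during = 830_000

# Relative drop computed from aggregate counts alone, with no
# per-subscriber data involved at any point.
drop = (connections_before - connections_during) / connections_before
print(f"estimated population change: -{drop:.0%}")
```

With these invented totals the computation yields a 17% drop, matching the order of magnitude of the published figure without any per-person tracking.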

Further monitoring

Unfortunately, the CNIL does not only promote Orange’s commercial offers. It also invites the government to adopt new legislation in case “more advanced” measures are needed – e.g. mapping every patient or confined person without their consent. However, the ePrivacy Directive prohibits any such legislation: location data may only be collected without people’s consent to fight crime (and only the most serious crimes, according to EU judges), not to fight the spread of a virus. (Under Article 15 of the ePrivacy Directive, States may require operators to process location data without consent where “national security” or “public security” justifies it. “National security” is defined in Article 4(2) of the Treaty on European Union as covering areas where the Union is not competent to act. Article 168 of the Treaty on the Functioning of the EU makes the EU competent to address disease issues, which therefore fall outside the scope of “national security” – otherwise Thierry Breton and the Commission would have no authority to combat the coronavirus on the territory of the Member States, as they are currently doing. “Public security” is described in Article 1 of Directive 2016/680 as an area “included” in the fight against criminal offences. The Court of Justice of the European Union is even stricter, specifying that “public security” only justifies surveillance of individuals linked to “a serious crime” (Tele2 judgment of 21 December 2016, paragraph 106). Fighting a virus is not fighting “serious crime” and is therefore excluded from the notion of “public security”.) Contrary to what one may read in the press, the GDPR cannot authorise such processing of location data. Only the ePrivacy Directive could, and in this case it prohibits it.

We would like to believe that, if the CNIL is calling on the government to violate European law, it is not just to restore the greatness of the country’s industry, but also to protect our health. Except that neither the CNIL, nor Orange, nor anyone else has been able to demonstrate the medical necessity of monitoring confined or sick people without their agreement – especially when they are undetectable in the absence of a test. While Singapore is offering an application based on an open protocol that lets people voluntarily reveal their movements, why is the CNIL defending Orange’s proposal, which violates the law, is far less respectful of our freedoms and, for its part, has demonstrated no efficacy against the virus?

For now, the government seems too busy with other things to respond to Orange’s call. Unlike the CNIL, we will not hesitate to take it to court if it yields to the risky ambitions of crisis profiteers.


First Success Against Facial Recognition in France
https://www.laquadrature.net/en/2020/02/27/first-success-against-facial-recognition/
Thu, 27 Feb 2020

Earlier this month, the Administrative Court of Marseille heard our case against the facial recognition systems controlling access to two high schools in Nice and Marseille. These systems had been authorised in December by the PACA Region as “experimental”. Yesterday, the Court annulled that decision.

The Court found that the Region had no power to take this decision – only the schools themselves have such powers. Furthermore, the Court found that it breached the GDPR: these systems were based on “consent”, but students’ consent cannot be “freely given” because of the authority relationship that binds them to the school administration.

Finally, the Administrative Court found, just as the CNIL underlined last October, that facial recognition is a disproportionate measure for controlling access to high schools, and that alternative measures are far less infringing of people’s rights. Incidentally, a magistrate stated during our hearing that “the Region is using a hammer to hit an ant”.

In France, this is the first court decision about facial recognition, and the first success against it! We hope it will be followed by a series of other successes leading to a total ban on facial recognition. In December, we published a joint letter with 124 organisations calling for a ban on “any present and future use of facial recognition for security and surveillance purposes”.

Next Monday, we will go before the Administrative Court of Marseille again, to be heard in our case against “smart” video-surveillance systems (which target, amongst other things, “abnormal” behaviour in public space).

Promoting freedom is priceless but has a cost. You’re welcome to donate if you can!

Technopolice: Resisting the Total Surveillance of Our Cities and of Our Lives
https://www.laquadrature.net/en/2020/02/04/technopolice-resisting-the-total-surveillance-of-our-cities-and-of-our-lives/
Tue, 04 Feb 2020

In September 2019, dozens of human rights organisations launched Technopolice.fr, a participatory campaign to document the spread of so-called “Safe City” projects across France and resist the proliferation of automated video-surveillance and predictive policing technologies. Here is the Technopolice Manifesto.

Throughout France, Smart Cities are showing their real nature: the total surveillance of urban public spaces for law enforcement purposes.

In Toulouse, Valenciennes, Strasbourg and Paris, local police forces are experimenting with video-surveillance technologies said to be “intelligent” because they rely on automated processing of video streams, enabling features such as facial recognition. In Saint-Étienne, a startup teamed up with local authorities to deploy “intelligent” microphones in low-income areas and alert the police in case of suspicious noise. A similar project is underway in Paris to monitor noise levels around bars and cafés.

In Marseille and Nice, defense and utility contractors such as Thales and Engie are working hand-in-hand with local officials to push their “Safe City” projects. Their applications range from the recognition of emotions in urban public spaces to the massive interconnection of databases for predictive policing purposes, but also the monitoring of online social networks. Computing technologies such as Big Data and Artificial Intelligence are the keystones of these various projects. They are the core building block for making sense of all the data that can be produced or collected, for establishing correlations, making statistical cross-checks, tracking individuals or managing places and services.

The so-called Smart City is turning our future into the Technopolice: Under the guise of optimization and decision support, they transform the whole urban world into a vast surveillance program. First, a large-scale surveillance dedicated to real-time control of flows of people and goods through centralized management, implemented from a hyperconnected command center. Then, a targeted surveillance of individuals and groups: as soon as “suspicious” behavior is detected, police apparatus can be unleashed to “neutralize the threat” and suppress the smallest “breach of the peace.” Or, conversely, reward citizens deemed virtuous by the State.

But we just have to look to the mirror of history or to other parts of the world to understand where the Technopolice is leading us: It will reinforce forms of discrimination and segregation, muzzle social movements, depoliticize public spaces, automate the police and denials of justice, while further dehumanizing social relations. All this and more at huge financial and ecological costs, since it will take taxpayers’ money, rare earths, plenty of electricity and many other resources to build and run these infrastructures.

Apart from a few seemingly consensual applications, the Smart City will mainly be used to reinforce the power of merchants of fear, and hide as long as possible the ineptitude of their policies. Technocrats rely on the Plan and the Machine to regulate our cities and our lives. Instead of the polis understood as a democratic city, as a pluralistic space of wandering, of impromptu meetings and confrontation with otherness, they want to bleed the city dry. The Technopolice looks like a gigantic test tube where the most advanced forms of social control are being developed.

Against this dystopia put forward by those who pretend to govern us, we call for unyielding resistance, in France and beyond.

Joint Letter from 80 Organisations: Ban Security and Surveillance Facial Recognition
https://www.laquadrature.net/en/2019/12/19/joint-letter-from-80-organisations-ban-security-and-surveillance-facial-recognition/
Thu, 19 Dec 2019

Our open letter (also available as a PDF) remains open for signature by organisations and companies (individuals are strongly encouraged to spread it widely). To sign it, please write to us at contact@laquadrature.net with the email subject “Signing facial recognition open letter”, giving the name of your organisation in the email. Thank you!

Joint letter: Ban Security and Surveillance Facial Recognition

We, organisations, collectives, companies, associations and trade unions, ask the French Parliament and government to ban any use of facial recognition techniques for security and surveillance purposes, now and in the future.

We note that such technologies are already widely deployed in France. In addition to the “PARAFE” automated border gates already installed in various stations and airports, the civil and military police have, since 2012 and the creation of the criminal records database (the Traitement des antécédents judiciaires), been able to run facial recognition on images captured in the street by surveillance cameras or taken from social media. Other experiments are already underway or planned.

Yet many public and private actors are not satisfied with the multitude of systems already installed, outside any real legal framework, without transparency or public discussion, and want to go further. Grounded in the fantasy of unavoidable technical development and pushing narrow economic and security-mongering arguments, they want to speed up and simplify the deployment of these systems, regardless of the consequences for our freedoms and our model of society.

Facial recognition is a uniquely invasive and dehumanizing technology, which makes possible, sooner or later, constant surveillance of the public space. It creates a society in which we are all suspects. It turns our face into a tracking device, rather than a signifier of personality, eventually reducing it to a technical object. It enables invisible control. It establishes a permanent and inescapable identification regime. It eliminates anonymity.

No argument can justify the deployment of such a technology. Beyond anecdotal convenience (using your face rather than a password to log in online or unlock your phone), its only effective promise is to hand the State a power of total control over its population – a power it will be tempted to abuse against its political opponents and certain communities. Because facial recognition for security and surveillance purposes is by essence disproportionate, it is pointless to entrust case-by-case evaluation to an authority which would, inevitably, fail to keep track of its numerous new applications.

This is why we ask you to ban any security and surveillance use of facial recognition. Such bans have already been decided in several cities in the United States. France and the European Union must go further, and, in keeping with the General Data Protection Regulation, build a European model that will respect its citizens’ freedoms. It will also be necessary to strengthen the requirements on protection of personal data and to limit the other uses of facial recognition. Be it for purposes of private authentication or identification, these systems on the whole do not offer sufficient protection of privacy, and they prepare, and normalize, a society of mass surveillance.

Together, we call for a ban on any security and surveillance use of facial recognition.

[DW] France embraces facial recognition tech
https://www.laquadrature.net/en/2019/11/18/dw-france-embraces-facial-recognition-tech/
Mon, 18 Nov 2019

Civil rights groups worry France is taking a step toward a surveillance state. It is about to become the first European Union country to introduce facial recognition software for government services. […]

But that’s exactly what Alicem doesn’t do, according to Martin Drago. He’s a legal expert at La Quadrature du Net, a group that defends digital rights and civil liberties and which is suing the government at France’s highest court of appeals.

“We think that France is infringing on the European General Data Protection Regulation, which bans facial recognition systems, apart from some exceptions,” he said. […]

[Reuters] French government seeks to comb social media to fight tax fraud
https://www.laquadrature.net/en/2019/11/15/reuters-french-government-seeks-to-comb-social-media-to-fight-tax-fraud/
Fri, 15 Nov 2019

[…] “An experiment without any goals is a joke,” said Arthur Messaud, a legal expert at French internet freedom advocacy group La Quadrature du Net. “We’re putting the cat among the pigeons by allowing the generalized monitoring of the Internet for everything and anything.” […]

[Politico] French privacy watchdog says facial recognition trial in high schools is illegal
https://www.laquadrature.net/en/2019/11/06/politico-french-privacy-watchdog-says-facial-recognition-trial-in-high-schools-is-illegal/
Wed, 06 Nov 2019

French digital rights group La Quadrature du Net, which led the charge against the high school trials and against facial recognition in France more broadly, applauded the decision.

“The very principle of facial recognition could therefore be rejected: Too dangerous for our liberties, those automated systems should always be ruled out in the favor of human practices,” the group said. […]

[TheStar] France set to roll out nationwide facial recognition ID program
https://www.laquadrature.net/en/2019/11/04/thestar-france-set-to-roll-out-nationwide-facial-recognition-id-program/
Mon, 04 Nov 2019

France is poised to become the first European country to use facial recognition technology to give citizens a secure digital identity — whether they want it or not. […]

“The government wants to funnel people to use Alicem and facial recognition,” said Martin Drago, a lawyer member of the privacy group La Quadrature du Net that filed the suit against the state. “We’re heading into mass usage of facial recognition. (There’s) little interest in the importance of consent and choice.” The case, filed in July, won’t suspend Alicem. […]