Shall the voters of the City and County of Denver adopt an ordinance that bans the City from using any facial recognition surveillance system and from using data derived from such a system in certain proceedings, and, in connection, allows a person to initiate a legal proceeding to enforce the ordinance?
Facial recognition surveillance programs exhibit gender and racial biases, which lead to deportation, harassment, and wrongful imprisonment.
Facial recognition surveillance programs carry a high rate of false-positive identifications.
Facial recognition surveillance programs collect and store biometric data, creating an appetizing target for identity thieves and state-sponsored hackers.
Facial recognition surveillance programs have no place in Denver.
The aim of this initiative is to put the question of municipal use of facial recognition technologies to Denver voters on the November 2020 ballot.
The language used in the proposed ordinance can be found at this link.
The initiated ordinance process in Denver outlines 7 steps to place a measure on the ballot. These steps, and their respective statuses, are listed below.
| Milestone | Status |
| --- | --- |
|  | Completed 8/27/19 |
|  | Attended 9/11/19, ordinance revised |
|  | Submitted 10/16/19 |
|  | Final draft approved 11/06/19 |
| Circulate petitions | 4,875 / 8,265 signatures |
| File completed petitions | Signature count insufficient at May 4th deadline -- campaign suspended |
| Elections Division determines petition sufficiency | Signature count insufficient -- campaign suspended |
Facial recognition surveillance is a technology that scans people in a video feed and uses an algorithm to compare still images to those in a reference database or on a watch list. The list of users includes large agencies like the FBI, NSA, and NYPD, but many local police forces are developing or integrating their own systems as well. It's a wide-reaching network: in 2016, one in two American adults was in a law enforcement face recognition network.
In the ballot title, found here, it is simply defined as: an automated or semi-automated process that assists in identifying or verifying an individual, based on the physical characteristics of an individual's face.
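For readers unfamiliar with how such a system operates, the sketch below shows the core comparison step in deliberately simplified form. It is illustrative only: the 128-dimensional embeddings, the cosine-similarity metric, the `match_against_watchlist` helper, and the 0.6 threshold are assumptions standing in for whatever a real vendor would ship, not language from the proposed ordinance or a description of any deployed system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return every watch-list identity whose stored embedding scores at
    least `threshold` against the face cropped from the video feed."""
    hits = [(identity, cosine_similarity(probe, reference))
            for identity, reference in watchlist.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

# Hypothetical data: random 128-dimensional vectors stand in for the
# embeddings a real face-recognition model would produce.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe_embedding = rng.normal(size=128)  # face detected in a video frame
print(match_against_watchlist(probe_embedding, watchlist))
```

Anything returned by that comparison gets reported as a "match," even when the person in the frame is not on the watch list at all; a looser threshold produces more of these spurious hits, which is exactly where the false positives discussed below come from.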
In addition to the fundamental notion of government-ordained facial tracking being distasteful, the technology itself isn't very effective. Today's facial recognition surveillance systems have a worryingly high rate of false positives: instances where a match is detected but turns out to be wrong. In London, the Metropolitan Police reported that 98% of the matches produced by their trial software were inaccurate.
Additionally, false positives are not evenly distributed across the population. MIT and Stanford researchers presented a paper last year showing that facial-analysis programs exhibit both skin-type and gender biases. That's because this technology relies on neural nets that analyze a library of faces and generate facial distinction points. The datasets most of these systems are trained on overrepresent lighter-skinned men, so when someone outside that group is processed through the algorithm, they are compared against fewer distinction points. That leads to a higher rate of false positives and increases the risk of systemic abuse against these populations.
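To make the uneven-error-rate claim concrete, here is a toy calculation rather than a benchmark of any real product: it assumes that comparisons between *different* people produce a noisier spread of similarity scores for a group the training data underrepresents, and then shows how one fixed decision threshold turns that noise into a higher false-positive rate for that group. The distributions and the 0.6 threshold are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def false_positive_rate(impostor_scores: np.ndarray, threshold: float) -> float:
    """Fraction of different-person comparisons that still clear the
    threshold, i.e. the false-positive rate at that operating point."""
    return float(np.mean(impostor_scores >= threshold))

# Hypothetical similarity scores for pairs of *different* people. The spread
# is wider for the underrepresented group, mimicking a model trained on a
# skewed dataset; every score above the threshold is a false positive.
well_represented = rng.normal(loc=0.30, scale=0.10, size=10_000)
underrepresented = rng.normal(loc=0.30, scale=0.18, size=10_000)

threshold = 0.6
print(f"well-represented group FPR:  {false_positive_rate(well_represented, threshold):.3%}")
print(f"underrepresented group FPR: {false_positive_rate(underrepresented, threshold):.3%}")
```

Same pipeline, same threshold, different error rates: that is the shape of the disparity described above, and it is why the harms fall hardest on the groups the training data leaves out.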
It is not the intent of this ordinance to disrespect law enforcement agents. The job they do is often difficult and thankless, and making it easier should be a priority. Facial recognition surveillance does not make the lives of police easier. It's an expensive system to integrate, it's vulnerable, and it's open to abuse, but most of all it's largely ineffective at its stated purpose. That may not be the case in 10 years, but it is the case now.
It is the intent of this initiative to prevent the Denver municipal government from utilizing facial recognition software, or information gathered from facial recognition software, from any source. This would mean that city officials at Denver International Airport ("DIA"), which is operated by the city's Department of Aviation, would be prohibited from legally using information obtained from federal partners like the Transportation Security Administration ("TSA"), the Federal Bureau of Investigation ("FBI"), and Customs and Border Protection ("CBP").
Additionally, this would prevent the use of such technologies in civilian agencies, like Denver's Motor Vehicle Division, which might utilize such a system in its driver's license database.
Yes. San Francisco, Oakland, and Somerville, Massachusetts have already passed local ordinances banning facial recognition. Denver would be the fourth city nationwide to implement such a prohibition.
It is not the intent of this ordinance to interfere with a private entity’s development or usage of this technology. However, under the proposed ordinance, the Department of Aviation, and any other municipal agent in the City and County of Denver, would be prohibited from utilizing information obtained from a system if it qualifies as facial recognition surveillance.
The only exemption listed in the ordinance text is the use of such a system for the purpose of redacting a recording for release or disclosure to protect the privacy of a subject depicted in the recording.
Any violation of this Ordinance constitutes an injury and any person may institute proceedings for injunctive relief, declaratory relief, or writ of mandate in any court of competent jurisdiction to enforce this Ordinance. An action instituted under this paragraph shall be brought against Denver and, if necessary to effectuate compliance with this Ordinance, any other governmental agency with possession, custody, or control of data subject to this Ordinance.
A broad ban makes any encroachment upon it, for good or otherwise, visible to the public. If Denver's city council elects to repeal all or part of the ban, the general population would be able to react in an appropriate and timely manner. Lighter regulations do not provide such a signaling mechanism and could even be used to expand a facial recognition dragnet.
The definition of “facial recognition surveillance” is any automated or semi-automated process that assists in identifying or verifying an individual, based on the physical characteristics of an individual’s face. There is no reason to believe that any reasonable body or person would, based on that definition, interpret more mundane video surveillance systems, like CCTV, as a form of facial recognition surveillance.
The intent of the proposed ordinance is not to limit Denver's ability to share legally obtained evidence or identifying information with other law enforcement bodies. However, if that evidence comes from a source classified as a facial recognition surveillance system, it would not be exempt from the ordinance's prohibition.
Yes. Obtaining social media facial data as part of a search warrant would not be restricted, but using it to secure an arrest or to charge an individual with a crime would be prohibited.
Personal home security devices would not be subject to this ordinance. However, municipal law enforcement agencies would only be able to utilize such data if it did not meet the definition of a facial recognition surveillance system.
We need evangelists, petitioners, and, above all, signatures. The initiative is also accepting monetary contributions via PayPal to help pay for marketing materials and any petitioner/tabling fees for local events. Use the form below to identify how you wish to contribute.
You still have the power to help. Click on the map below to see what's happening near you and learn how you can contribute.
The owner of this page may be reached at 5280not1984@gmail.com