Last week, the Arkansas legislature enacted HB 1570, a bill that bans gender-affirming health care for trans youth in the state. Lawmakers voted just one day after Arkansas Gov. Asa Hutchinson vetoed the bill, and amid pleas from doctors, social workers, and parents of transgender youth to protect health care for trans youth. Arkansas is the first state in the country to enact such a ban.

“This [bill] was based not on science, not on compassion, not on shared humanity, but on just this sheer desire to harm trans people, and that was so apparent from the start,” ACLU’s Deputy Director for Transgender Justice Chase Strangio told us on At Liberty. “What the state has done is essentially saying ‘We’re going to take away the single most helpful thing to save lives in this community and not only make it difficult to access, make it prohibited.'”

Strangio helped us unpack this legislation in the context of a year in which more than 100 bills that target and harm trans people have been introduced in legislatures across the country.

Special Report: Chase Strangio on the Legislative Assault on Trans Youth

Date

Thursday, April 15, 2021 - 4:00pm

Featured image

A demonstrator holding a sign with the text "Stop HB1570."

Related issues

LGBTQ+ Rights Students & Youth Rights

Teaser subhead

At Liberty unpacks a year in which more than 100 bills attacking trans people have been introduced across the country, and where the ACLU is focusing its firepower to protect trans youth.

Jeremy Shur, Student Attorney, University of Michigan Law School Civil Rights Litigation Initiative

Deborah Won, Student Attorney, University of Michigan Law School Civil Rights Litigation Initiative

Last year, Detroit police arrested a Black man, Robert Williams, based on a false face recognition match. They held him in a dirty, overcrowded cell for 30 hours until they realized that “the computer got it wrong.” Unfortunately for Williams, it was too late. He had already been arrested in front of his family and missed work because he was in jail for a theft he did not commit. And even after he was freed, Williams still had to hire a lawyer and go to court to defend himself.

Today, the ACLU joins the University of Michigan Law School’s Civil Rights Litigation Initiative and the ACLU of Michigan in filing a federal lawsuit charging that the police violated Williams’ rights protected by the Fourth Amendment and Michigan’s civil rights law. The lawsuit seeks damages and demands that the Detroit Police Department institute policy changes to halt the abuse of face recognition technology.

It is well documented that face recognition technology is deeply flawed. The technology has a disturbing record of racial bias against people of color and other marginalized groups. Many jurisdictions ban its use for that reason, among others. Face recognition is especially unreliable when attempting to identify Black people, when the photo used is grainy, when the lighting is bad, and when the suspect is not looking directly at the camera. All of these circumstances were present in the photograph that the Detroit Police Department used in its shoddy investigation, and are common in the type of photographs routinely used by police officers around the country when deploying face recognition technology.

Despite the technology’s well-known flaws, Detroit police relied on it almost exclusively in their investigation. They did not perform even a rudimentary investigation into Williams’ whereabouts during the time of the Shinola shoplifting incident. If they had, they would have realized that Williams was not the culprit: he was driving home from work outside of Detroit at the time the incident took place. Instead, the police asked an individual who was not even at the store to “identify” the culprit in a six-person photo lineup. The individual had no basis for being asked to identify the suspect: She was not a witness to anything except the same grainy image taken from surveillance video that the police already had. Worse still, the identification process was supposed to be a blind procedure in which the officer who conducts it doesn’t know who the suspect is to avoid tipping off the witness, but the officer who facilitated it already knew that Williams was the suspect.

After the “witness” falsely identified Williams, the police submitted a faulty and misleading request for an arrest warrant. They did not include the probe image used to generate the faulty face recognition “match,” nor did they disclose that the image was blurry, dark, and showed an obstructed, barely visible face turned away from the camera, conditions that are impermissible for a face recognition search by even the police department’s own standards. They also failed to mention that face recognition technology is known to be faulty under these circumstances. Nor did they disclose that the image of Williams that “matched” with the culprit’s was actually his expired driver’s license, rather than the most current image of him on file with the state. Moreover, the police did not mention that the individual who picked Williams out of the lineup had never actually seen the shoplifter in person. And perhaps most egregiously, the police did not explain that both the unknown suspect and Williams are Black, and that face recognition technology is well known to produce significantly higher rates of false matches when used to try to identify images of Black people, as compared to white people.

This violation of the Williams family’s rights is a stark example of the dangers of this technology, especially when used by law enforcement against people of color. Government use of face recognition technology not only poses a danger of turning our streets into a surveillance state, but it also threatens integral aspects of our criminal legal system.

The Fourth Amendment guarantees the right to not be searched or arrested without probable cause. When police use face recognition technology to investigate a crime, they are using a flawed, racially biased tool that does not create the “individualized” suspicion required to establish probable cause. Instead, it turns everyone into a suspect by placing us all in the virtual lineup every time the police investigate a crime. So when face recognition technology produces a “lead,” police officers and magistrates would be unwise to trust it, especially if the person in the probe image is Black. Had the police officers who arrested Williams taken the time to compare the probe image to Williams’ appearance, they would have realized that the two individuals were not the same.
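The “virtual lineup” point is quantitative at heart: even a tiny per-comparison false-match rate, multiplied across a database of millions of photos, all but guarantees false “leads.” A minimal sketch of that arithmetic, using an illustrative false-match rate and database size that are assumptions for the example, not figures from any specific system:

```python
# Illustrative sketch: why searching one probe photo against a large
# gallery (the "virtual lineup") reliably produces false "matches."
# The rate and gallery size below are assumed for illustration only.

def expected_false_matches(false_match_rate: float, gallery_size: int) -> float:
    """Expected number of incorrect 'matches' for one innocent probe photo."""
    return false_match_rate * gallery_size

def prob_at_least_one_false_match(false_match_rate: float, gallery_size: int) -> float:
    """Chance that a gallery-wide search returns at least one false match,
    treating each comparison as independent (a simplifying assumption)."""
    return 1 - (1 - false_match_rate) ** gallery_size

# Assumed: a 0.1% per-comparison false-match rate searched against
# a gallery of 1 million driver's-license photos.
rate, gallery = 0.001, 1_000_000
print(expected_false_matches(rate, gallery))         # ~1,000 expected false matches
print(prob_at_least_one_false_match(rate, gallery))  # effectively certain (~1.0)
```

Even a system that is “99.9 percent accurate” per comparison, on these assumed numbers, would hand investigators roughly a thousand innocent “matches” per search, which is why a face recognition hit alone cannot supply individualized suspicion.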

Humans, including police officers and judges, have a hard time understanding just how frequently computers make mistakes. That means that, where face recognition technology is involved, judges may be more likely to believe that the officers established probable cause — a dangerous outcome that could lead to untold numbers of false arrests. Anyone who questions whether people will unthinkingly rely on what the computer tells them to do should think about the last time they turned the wrong way because Google Maps told them to do so, even when they knew it was a wrong turn.

The flaws of face recognition technology also seep into the rest of our criminal legal system. Defendants often do not ever find out that they were accused by a computer. Without this knowledge, defendants are denied their constitutional rights to present a complete defense at trial — they cannot point to the flaws of the accusatory computer that led to their arrest or understand how those flaws might infect the subsequent investigation.

Further, even if defendants do learn that a computer accused them, they face additional hurdles throughout the system. While defendants are supposed to be given access to information about the flaws of human witnesses pursuant to the Supreme Court’s decision in Brady v. Maryland, they are routinely denied that same information about the flaws of technological accusers that serve similar roles in the investigation and prosecution of a crime. Similarly, while defendants are allowed to cross-examine human witnesses at trial under the U.S. Constitution’s Confrontation Clause, defendants are often prevented from similarly testing the reliability of computers that produce testimony against them at trial. Police use of face recognition technology is insidious in any context, and its flaws only become more compounded throughout the life of a defendant’s criminal case.

Williams’ case makes it painfully clear: Face recognition technology turns everybody into a suspect and threatens our civil rights. We cannot allow flawed technology to participate in an already-flawed criminal legal system — especially when an individual’s life and liberty are at stake.

Date

Tuesday, April 13, 2021 - 10:15am

Featured image

Facial recognition software scanning a face.

Related issues

Privacy Racial Justice Police Practices

Teaser subhead

Face recognition technology turns everybody into a suspect and threatens our civil rights.

Jennifer Stisa Granick, Surveillance and Cybersecurity Counsel, ACLU Speech, Privacy, and Technology Project

UPDATE (7/27/21): The ACLU and ACLU of Texas filed a new amicus brief arguing for robust constitutional limits on searches and seizures of smartphones and computers. In this case, United States v. Morton, pending in the Fifth Circuit, investigators got a warrant to search a man’s cell phone merely because he was allegedly in possession of drugs, and then searched the photographs on the phone. Our brief argues that the Fourth Amendment does not permit searches without a factual basis and limits investigators to looking only at categories of data where evidence could be found.

Think about all the information stored on your cell phone or computers — photos, texts, location data, and even more information generated by a multitude of apps. These tools are convenient and integral to our everyday lives. But what does that mean for our privacy rights in the digital age? Today, the Wisconsin Supreme Court hears oral arguments addressing this question, including arguments that the ACLU has presented in a series of amicus briefs.

The Wisconsin case concerns cell phone data seized by the police. During a hit-and-run investigation in 2016, a suspect, George Burch, consented to a police search of his text messages to confirm his alibi at the time of the accident. The police should have just downloaded the text messages and returned the information after the investigation ended. Instead, they downloaded the entire contents of Burch’s phone and held onto all the data. Months later, an entirely different law enforcement agency found out that police still had the information and searched it for evidence of Burch’s involvement in a murder — without getting a separate warrant or asking his permission. That search — which went well beyond the agreed upon text messages and hit-and-run investigation — led to evidence linking Burch to the murder.

If police want to lawfully access seized data for a separate investigation, at the very least they must obtain a second search warrant. Without a warrant, there is no way to ensure that the police are acting with good cause, rather than just rummaging through private data for improper reasons, or for no reason at all. Once an investigation is over, the police are required to return or delete information seized.

The ACLU has been fighting unconstitutional search and seizure practices by law enforcement for years. In 2015, we filed an amicus brief in United States v. Ganias, a case in the Second Circuit Court of Appeals concerning the FBI’s retention of electronic records seized in a fraud investigation, which it held onto and later searched to investigate a different fraud suspect over two years later. While a panel of judges agreed with us that the retention of this data was unconstitutional, the full Second Circuit did not rule on this particular legal question, and the problem persists.

In 2014, we filed an amicus brief in United States v. Hasbajrami, a case involving warrantless foreign intelligence surveillance. In that case, the Second Circuit issued an opinion holding that subsequent querying of stored information may, depending on the circumstances, be regulated by the Fourth Amendment and require a warrant.

In the past year, we’ve filed two other amicus briefs in state Supreme Courts on the same issue — in these cases, as in Burch, police didn’t bother to get a second warrant.

In State of Michigan v. Hughes, police seized the defendant’s cell phones in conjunction with a drug trafficking offense to which he pleaded guilty. The defendant was also suspected of armed robbery. The jury hung, failing to convict him, not once but twice. Unsatisfied, the prosecution at the third trial introduced evidence obtained by searching the cell phones for evidence of the robbery, a crime for which they did not have a warrant.

In State of Illinois v. McCavitt, law enforcement obtained a search warrant to investigate a police officer for several crimes against a single victim. Eight months later, the officer was acquitted. The next day, law enforcement — still in possession of the defendant’s hard drive under the first warrant — conducted a new search of the hard drive data, hoping to find evidence of different crimes against additional victims.

As these three state Supreme Court cases illustrate, courts are starting to scrutinize broad, free-wheeling searches of the libraries’ worth of sensitive, private information stored on our electronic devices. None of them have ruled yet. But our briefs explain that the Fourth Amendment can and must effectively limit searches and seizures in the digital age, preventing the founders’ reviled “general search” — rummaging through private data for any or no cause and without judicial oversight. Nor should our private information stay in government hands forever. At some point, the constitutional requirement of reasonableness means that our documents, photos, and messages get returned or deleted.

The government should not take advantage of its investigatory powers to build permanent digital dossiers just in case. To retain this information is a moral hazard, putting privacy and other civil rights and civil liberties at risk of an all-seeing government eye.

Date

Monday, April 12, 2021 - 2:15pm

Featured image

Hand holding smartphone

Related issues

Police Practices Privacy

Teaser subhead

Courts are starting to scrutinize free-wheeling searches of the libraries’ worth of private information stored on our electronic devices.
