How Online Censorship Harms Sex Workers and LGBTQ Communities

FOSTA is bad, but holding online platforms liable for crimes they know nothing about won’t solve any problems.

FOSTA, a law addressing sex trafficking on online platforms, has led to online censorship and endangered sex worker and LGBTQ communities.

Laura Moraff, Brennan Fellow, ACLU Speech, Privacy, and Technology Project

In 2018, Congress passed a law that aimed to address sex trafficking on online platforms. That’s a laudable goal, but the way Congress went about it has led to dangerous and disproportionate online censorship — particularly of the communities the law was trying to protect.

The law, the Allow States and Victims to Fight Online Sex Trafficking Act/Stop Enabling Sex Traffickers Act (FOSTA), hasn’t meaningfully addressed sex trafficking. Instead, it has chilled speech, shut down online spaces, and made sex work more dangerous. Now, courts are poised to interpret it in a way that could make these harms even worse — by imposing liability on online platforms that don’t even know their services are being used for sex trafficking. That’s why we, along with the Center for Democracy & Technology (CDT), recently filed three friend-of-the-court briefs on behalf of ourselves and a number of LGBTQ+, sex worker, and startup advocacy organizations, arguing that courts should not interpret the law in this way, because doing so would incentivize online intermediaries to censor more online speech — especially materials about sex, youth health, LGBTQ+ identity, and other important concerns.

The criminal provisions of FOSTA have already chilled online speech. Those provisions created a carve-out to platform immunity under Section 230 for certain crimes related to sex trafficking. Just as the ACLU warned it would, that carve-out has caused online platforms to remove information that sex workers rely on to stay safe, and to shut down conversations about sex education and sex work, particularly by and for LGBTQ+ people. Instagram, for example, has deleted accounts that post about sex; meanwhile, some niche, free, and queer websites have shut down entirely. Online intermediaries have largely shut down the “bad johns” lists that sex workers compile to identify abusive clients, as well as many affordable ways to advertise sex work.

This has pushed people in the sex trades, who work in legal, semi-legal, and criminalized industries, off of online platforms and into dangerous and potentially life-threatening situations. Some sex workers have had to return to outdoor work or to seeking clients in person in bars and clubs, where screening is more rushed than it is online, and where workers are more vulnerable to violence, police harassment, and HIV. In one study of FOSTA’s effects on sex workers, researchers found that the law had increased economic instability for roughly 72 percent of the study’s participants, and nearly 34 percent reported an increase in violence from clients.

Against this backdrop of harm, FOSTA has not led to more prosecutions of sex trafficking. Despite the expanded definition of sex crimes under FOSTA, a June 2021 report by the Government Accountability Office showed that federal prosecutors had rarely used the additional criminal penalties established by FOSTA. At that time — more than three years after the passage of the law — the DOJ had prosecuted only one case under FOSTA. Indeed, it appears FOSTA made it more difficult to gather evidence and prosecute those who use platforms for sex trafficking purposes, in part because platforms relocated overseas.

As if FOSTA hasn’t already caused enough harm, courts are now poised to make things worse by interpreting the law in a way that would open platforms like Twitter, Omegle, craigslist, and Reddit to liability for sex trafficking activities that occur on their services, even if the platforms have no actual knowledge of such activities. If the courts decide that platforms can face liability merely because they “should have known” that sex trafficking was occurring on their websites, many online platforms will inevitably respond by policing their users and taking an even more aggressive approach to content moderation. In particular, they will likely remove and censor more lawful content related to sex and sexual health, including by using blunt automated tools that tend to perpetuate real-world biases and are unable to understand context.

Imposing liability without actual knowledge would not only exacerbate real-world harms, but it would also raise serious First Amendment questions. As the Supreme Court recognized more than 60 years ago, when considering whether to impose liability on booksellers for the contents of the books they sell without requiring some level of knowledge, distributors “will tend to restrict the books [they] sell[] to those [they have] inspected,” and, as a result, “the contents of bookshops and periodical stands . . . might be depleted.” Because online intermediaries — from Google to Twitter to WhatsApp to Amazon Web Services, and other providers who will be governed by the outcome of this case — facilitate the publication and spread of billions of pieces of content every day, this concern is even more serious when it comes to online speech.

We urge the courts to avoid imposing liability on platforms when they do not actually know that particular crimes are occurring on their websites. Doing so will help ensure that all people, especially the LGBTQ+ community, sex workers, and sexual health educators, can freely and safely express themselves online.