Apple and Google still have an LGBTQ problem

Silicon Valley should get out of the business of regulating sexually explicit content – its actions against NSFW materials put marginalised groups at even more risk


Being afraid that someone, somewhere, is having sex isn’t just for parents of teens at sleepovers and conservative politicians anymore. In June, Apple announced tweaks to its policies regarding hookup apps, prohibiting apps that “contain[ed] pornographic content or that facilitate prostitution”. But after critics raised the alarm, wondering how far these restrictions would go and whether they would include removing Grindr – an app primarily used by gay men, on which dick pics and hook-ups are commonplace – Apple backtracked.

Apple said the change merely codified existing policies, and it would only apply to hookup apps that “primarily focus on pornography” or “that facilitate human trafficking under the guise of being used for hookups”. Apple hasn’t provided any examples of what these apps may be. And it is not the only Big Tech company that has a problem with sex. At the end of July, Google announced it would also crack down on apps and content that “may be interpreted as promoting sexual acts in exchange for compensation”. This is on top of its already overly restrictive stance on nudity.

Then, on August 5, Apple announced a new technical mechanism for ensuring “child safety” that appeared to give teenagers a choice: either not see material that a machine learning algorithm flagged as nudity, or have it reported to their parents. The company later said the new parental notifications will only apply to children under the age of 13.

Apple's “clarification” of its policies around hookup apps, its missteps regarding queer teenagers, and Google’s new rules are just another example of why Silicon Valley should get out of the business of regulating sexually explicit materials. The vague guidance and inconsistent standards used against apps harm LGBTQ people, especially those who are using dating apps to find community, love, or plain ol’ hook-ups in countries with explicitly anti-LGBTQ laws. And because Apple, in particular, uses the App Store to police content on an app-wide level, companies that wish to have access to people with iPhones have no choice but to comply.

Attempts to limit sexually explicit speech tend to (accidentally or on purpose) harm LGBTQ people more. Many of these rules are aimed in particular at sex workers, and LGBTQ people are disproportionately likely to use sex work as a means of survival. A recent report from Fight for the Future, a digital rights group, argues that Apple often outright refuses to provide access to apps that primarily serve LGBTQ people.

But even neutral rules, which are supposed to apply to everyone equally, can be problematic. First, there is often a double standard regarding what counts as sexual content. A kiss between two lovers may count as PG if between a man and a woman, but if two men are kissing, the same act can be viewed as not safe for work. This phenomenon is well documented historically, in contexts as diverse as obscenity trials in Cold War Los Angeles and YouTube channel demonetisation.

Neutral rules are also often weaponised against LGBTQ community members, often specifically transgender or gender non-conforming people. As far back as 2012, scholars have documented communities using “flagging” or user reporting to police gender expression. And as Jillian C. York, the Electronic Frontier Foundation’s director for international freedom of expression, has discussed, once a post is flagged, it can be nearly impossible to actually understand why it was removed or open a dialogue about how to comply with (often opaque) community standards.

Dating apps and other spaces that combine sexual content with other forms of sociality can serve as important community meeting spaces for queer folks, both in regions where they are at risk and in those where they are not. Think of a gay club or a lesbian bar – although some people may be there to meet others (or exchange nudes), others might show up to connect with a broader community. Apple’s restrictions on Tumblr, which likely played a causal role in the site’s elimination of NSFW content, fragmented queer communities. More recently, chat service Discord limited access to NSFW-marked servers from iOS, blaming the App Store’s guidelines. This policy effectively harmed a number of primarily queer meeting spaces, throwing communities into disarray before the rules were loosened.

But the effects on western LGBTQ people are just the tip of the iceberg. These restrictions and bans have much broader effects on populations often left at the periphery when such policy changes are considered. Restrictions on sexually explicit material in apps can harm queer folks in higher-risk environments, for example in the Middle East and North Africa (MENA) region.

Queer dating apps are uniquely important for MENA queer communities, representing the most intimate and fragile interaction between people and their technologies. When meeting in physical spaces is highly risky and can lead to arrests and prosecutions, virtual spaces are fundamental. This is especially the case for members of the community who live outside the larger cities and those who may not have access to queer NGOs or community centres. Despite the fact that queer dating apps are sometimes used to target and arrest LGBTQ people, a 2018 report from human rights organisation ARTICLE 19 found that 60 per cent of the research respondents said they would continue to use the apps. The drive for sex, love, intimacy, and association is stronger than the fear of the risks.

Apple’s policy change is unlikely to affect the way larger queer dating apps function. Grindr, Bumble, and Tinder have advanced policy and legal teams who can keep their apps functioning. The new rules and crackdowns on sexually explicit material are likely to affect smaller, lesser-known apps.

And it’s those smaller apps that matter the most to high-risk users. In the unpublished raw data for the ARTICLE 19 report, we identified at least 22 different major dating apps being used in Iran, Lebanon and Egypt in 2018 (the number is likely much higher now). In certain MENA countries, especially Egypt, vice police have been using queer dating apps to identify and arrest individuals. In 2020, for example, Egyptian LGBTQ NGO Bedayaa reported that 47 per cent of that year’s arrests came via app entrapment. To bypass police, many people manoeuvre between apps or switch to apps that are less well known, and thus less surveilled.

In Egypt and many similar contexts, sex workers use app switching and lesser-known apps to avoid law enforcement and for the necessities of supply and demand. “Sex workers in Egypt use less popular apps to avoid the police or based on their customer base needs, and they also switch between apps,” says Nora Noralla, a researcher and executive director of Cairo 52, an Egyptian human rights organisation. “It’s important to have this option otherwise the police know exactly where to find people,” she says when asked about the impacts of Apple’s policy change. “More work needs to be done to make the smaller apps more secure, not more work to cut the number of available apps.”

Sex workers are not the only ones switching apps to stay safe – queer populations across MENA use the same techniques. “The queer community, they use Grindr to meet up,” one lawyer told us in an interview. “If there are security alerts, then they would use more private applications.”

If Grindr is the only app to survive Apple’s policy changes, that option of switching between apps disappears. It is unclear whether any apps have been removed since the policy change was announced, and because the App Store checks apps for compliance when they are updated, we may not know for a while what enforcement might look like. But we don’t need to predict the future to see what would happen when smaller apps run afoul of Apple’s rules – we can look at what’s already happened. Sanctions on Iran have already removed access to certain essential technologies used by marginalised groups. That has led to further isolation, risk, and discrimination, compounding the already significant human rights abuses by the government.

“These bans and limits [...] have decreased my access to communication and networking tools,” one Iranian queer person told ARTICLE 19. Another says the apps remain vital, despite the risks some can pose. “I knew some of the accounts on these apps were fake, however, when I had access to them, I had this hope that there’s someone like me, someone living nearby that I could reach out to,” they say. “[The] decision to ban Iranian users has killed that hope.”

Ro Isfahani, a journalist with a focus on Iran, also believes that “queer people in high-risk environments [will continue to bear] the brunt of these policies”. He was clear about the potential consequences of these seemingly technical changes, saying “this new policy is certain to further curb queer folks’ access to safe spaces that they have nurtured despite risks to their lives”.

Apple and Google are not the only companies afraid of sex. For decades, advocates who equate all sexually explicit materials with exploitation or trafficking, such as the National Center on Sexual Exploitation, formerly Morality in Media, have suggested that the best thing to do is ban sex on the internet entirely. Recently, anti-sex campaigners have been remarkably successful, driving payment card processors such as Mastercard and Visa to drop PornHub, and Instagram to equate sexually explicit content with violence as part of its sensitive content restrictions.

Some of that has been a result of (actual or manufactured) litigation fears stemming from the United States’ passage of the Fight Online Sex Trafficking Act, or FOSTA, in 2018. The law, which claimed to be about preventing sex trafficking, made changes to Section 230, a bedrock law that limited liability for online platforms. Although these changes were minor, they caused significant upheaval in terms of companies’ willingness to host sexual content. Other events, such as the closure and prosecution of Backpage, left limited options for consenting adults who wanted to engage with sexual material online. But Apple’s policy changes, in particular, sweep far broader than what FOSTA covers. (Google’s new rules hew closer to the law.)

Google and Apple’s intentions may be good – people deserve to be able to curate what they see on their phones and should only see sexually explicit content that they consent to. And advocates have been working to make apps safer for high-risk users. But kicking services out of the App Store because of the presence of NSFW materials forces apps to choose between policing people and not being able to reach them. Either way, LGBTQ people, across countries and contexts, lose.

Afsaneh Rigot is a senior researcher at ARTICLE 19 and a Technology and Public Purpose fellow at Harvard Kennedy School’s Belfer Center. Kendra Albert is a clinical instructor at the Harvard Law School’s Cyberlaw Clinic.



This article was originally published by WIRED UK