2024 Social Media Safety Index

Spotlight on Data Protection

The need for robust, federal data protection — and how it would help protect LGBTQ people online and off


As in years prior, the 2024 Social Media Safety Index Platform Scorecard highlights the importance of robust data safeguards. As GLAAD and other digital rights experts have long noted, the tech industry is largely failing to self-regulate and continues to prioritize profit over public safety and human rights, including by failing to provide adequate data protection. The Electronic Privacy Information Center (EPIC), for example, writes: “social media companies harvest sensitive data about individuals’ activities, interests, personal characteristics, political views, purchasing habits, and online behaviors. In many cases this data is used to algorithmically drive user engagement and to sell behavioral advertising—often with distortive and discriminatory impacts.” This harms LGBTQ people online and off.

When social media companies exploit our personal information for profit, platforms become less safe, especially for marginalized communities who are disproportionately harmed by targeted surveillance and disinformation. This lack of data privacy protections helps fuel the business model known as surveillance capitalism and, with it, a hate and disinformation ecosystem, particularly through media manipulation campaigns that drive polarization and animus. Platforms use our personal information to algorithmically propel user engagement and to sell advertising, including by targeting people based on personal characteristics and behavior. In 2022, the United Nations wrote: “Freedom of expression and the right to privacy are among the human rights most impacted by the digital transformation. Interrelations between these rights, too, have been transformed.”

All of this largely happens in the dark: there is little transparency into data collection and usage practices, and privacy policies are often convoluted. Platforms should clearly disclose user options to control the collection, inference, and use of information related to their sexual orientation and gender identity, and should not allow third-party advertisers to target people based on that data.

There are also salient connections to the disproportionate risks of state surveillance that LGBTQ people and other marginalized groups face. In the U.S. and around the world, police have been known to target LGBTQ people online, including via their social media activity. Law enforcement authorities in countries including Egypt, Iraq, Jordan, Lebanon, and Tunisia have also reportedly used social media monitoring to track and persecute LGBTQ users. Human Rights Watch described the disturbing trend in January 2024: “Security forces have entrapped LGBT people on social media and dating applications, subjected them to online extortion, online harassment, doxxing, and outing; and relied on illegitimately obtained digital photos, chats, and similar information in prosecutions. In cases of online harassment, which took place predominantly in public posts on Facebook and Instagram, affected individuals faced offline consequences, which often contributed to ruining their lives.” In such cases, strong data protection regulations and enforcement, along with continuous human rights impact assessments, could help reduce these harms.

Some companies (and regulatory agencies) are exploring meaningful solutions to improve industry standards. For example, decentralized social media platforms like Mastodon and Bluesky have opened up possibilities for less exploitative data collection practices, offering users more control. In December 2023, following pressure from LGBTQ and reproductive rights advocates, Meta enabled end-to-end encryption on Messenger, strengthening privacy on that platform. On the other hand, as Accountable Tech has pointed out, Google is still failing to protect the location privacy of people who seek reproductive healthcare (a problem of particular concern for those seeking trans healthcare). Meta may also soon face a ban on the processing of personal data for targeted advertising in the EU, which implemented a comprehensive, EU-wide data protection law, the General Data Protection Regulation (GDPR), in 2018.

The U.S. does not have a federal data protection law, despite long-standing calls from civil society; in its absence, nearly a dozen U.S. states have passed consumer privacy laws in recent years (many of which tech companies have lobbied against). In April 2024, federal lawmakers introduced the American Privacy Rights Act (APRA), a sweeping, bipartisan proposal to adopt national online privacy protections and give consumers more power over their data. In 2022, the FTC launched a rulemaking process on corporate surveillance and data security, and in 2023, the FCC created a privacy and data protection task force. To truly protect everyone from Big Tech’s business models, we need strong, thoughtfully crafted data protection oversight today.

