2024 Social Media Safety Index

Solutions for All: Legislative and Regulatory Approaches to Social Media and Tech Accountability

The expertise, watchdog work, and guidance of tech accountability groups, researchers, and civil society organizations like GLAAD — urging social media companies to voluntarily improve safety, privacy, and expression on their platforms for LGBTQ people — continues to be vitally necessary. What is even more urgently needed is accountability via external oversight and regulation of the entire industry, to protect the public interest from the harmful business practices and product designs of social media companies.

As GLAAD has previously noted, amid the complexities of the many current U.S. legislative proposals intended to fix these problems, especially the many proposals focused on youth safety, it is important that such approaches be carefully crafted lest they create unintended harms for LGBTQ people and other marginalized communities (and everyone). To preserve LGBTQ rights and safety, regulatory solutions should focus on addressing specific harmful business practices of tech companies, such as surveillance advertising and the over-aggressive collection and misuse of user data, rather than on proposals that would actually expand data collection and potentially expose people to government surveillance or lead to the suppression and censorship of LGBTQ material.

The chilling impact of 2018’s misguided FOSTA-SESTA (Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act) legislation should serve as an object lesson in how attempts to solve problems can create grave new harms, especially for historically marginalized communities. In the case of FOSTA-SESTA, the broad consensus among researchers, legal scholars, and even the Government Accountability Office is that, as historian Sascha Cohen writes in her in-depth overview in The Nation: “the law has been counterproductive at best and deadly at worst.” FOSTA-SESTA has especially impacted LGBTQ people and sex workers, resulting in a situation where, as Melissa Gira Grant pointed out in The New Republic, “The ‘solution’ became the problem.” (To learn more, read the aforementioned feature in The Nation.) There are other approaches to addressing children’s safety online that rightly put the burden on platforms to address these issues for all users.

Delving into why kids-only legislation won’t solve the problem of widespread online manipulation and harm, Nora Benavidez (lead author of the 2023 Free Press report, Big Tech Backslide) observes in a January 2024 Tech Policy Press article: “Comprehensive privacy and civil rights protections are possible. They would also avoid the pitfalls of well-intentioned legislation that claims to reduce harm and remove harmful content but actually exposes everyone, including children and teens, to more invasive practices and government overreach. The most powerful step we can take now to rein in online manipulation is to introduce and pass robust federal data-privacy legislation that would limit the data collected about, and then used against, all of us.”

Of course, social media companies engage in business practices that prioritize their corporate profits and bottom line rather than the best interests of society. This is a fundamental reality of all industries, and not a surprising one. The EPA, FDA, SEC, and other regulatory agencies came into existence for this very reason. Indeed, creating guidelines and oversight to ensure the public health and safety of the American people is not a radical idea. Although tech and social media companies may be under-regulated in the U.S., thankfully there are existing public agencies that do have some jurisdiction over the industry. For instance, the Federal Trade Commission (FTC) continues to protect the public interest through mechanisms such as its ongoing antitrust case against Meta and by imposing fines, as it has in response to Meta’s repeated data privacy violations, including the Facebook Cambridge Analytica case. In May 2023, Samuel Levine, director of the FTC’s Bureau of Consumer Protection, stated: “Facebook has repeatedly violated its privacy promises. The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”

Delving into the array of regulatory approaches across the globe — including the EU’s Digital Services Act (DSA), the work of the Australian eSafety Commissioner, and others — is beyond the scope of this report. But it is illuminating to see that other governments around the world are taking rigorous approaches to protect the public interest over the business interests of corporations, via regulation of tech and digital business models and practices.

While implementation of oversight may be a long and complex process, lawmakers must find solutions that do not create new problems, solutions that require companies to be accountable and transparent, solutions that protect us all.

