
The Algorithm Police: Are Social Media Policies Biased?

Social media algorithms may be biased against plus-sized bodies, LGBTQ+ people, and women of color.


By Ella Dorval Hall

February 20, 2020

We frequently get caught up in assessing the negative effects of social media on young people, particularly when it comes to their sexuality. Yet research and LGBTQ+ folx alike point to the importance of the internet for developing identity, community, and self-exploration.

As Amber Leventry describes in “The importance of social media when it comes to LGBTQ kids feeling seen,” “the Internet can be a refuge—a safe place to feel less alone. For queer youth to feel normal, they need to see, read, and hear the voices of others who look like them and use the same identifying labels.” Leventry describes social media as a space for people to “find solace—and the acceptance they lack—in their phone or laptop.” (See also our infographic, Growing Up LGBTQ+ Online.)

In other words, for LGBTQ+ people to have these positive experiences online, they must be represented there.

So, what if you could no longer find LGBTQ+ content on social media? What if this resource that provides a refuge for people with marginalized identities no longer represented them?

These questions may sound a little extreme, but Salty, a magazine that fights for “digital visibility for women, trans, and non-binary people,” explores just how biased algorithms may be in censoring the content of LGBTQ+ people.

In response to conversations about how “certain bodies and perspectives are being policed, and how certain people are targeted for censorship more than others on Instagram,” Salty published a report on “algorithmic injustice.” The report states that Facebook and Instagram are biased against “women-focused businesses, sex workers, queer people, BIPOC, and plus-sized people.”

Salty attributes this to the fact that the “algorithms are based on policies, and policies are created by humans—humans with bias.” After surveying 118 Instagram users, they report that “queer folx and women of color are policed at a higher rate than the general population” and that “plus-sized and body-positive profiles were often flagged for ‘sexual solicitation’ or ‘excessive nudity.’”

In addition, after Facebook and Instagram sent Salty their advertising policies for underwear and swimwear, Salty reported 22 factors outlining how “models can sit, dress, arch their backs, pose, interact with props, how see-through their underwear can be, how the images can be cropped and where their ads can link to.” Perhaps most startling: all of these policies explicitly target women, and none pertain to men’s bodies.

So, what are the implications of potentially “biased algorithms” and policies?

If plus-sized bodies, LGBTQ+ people, and women of color are censored at a higher rate, it may mean these folx are underrepresented, are harder to find on social media, or experience more policing of their sexuality. That is particularly challenging for young LGBTQ+ people who seek out social media to feel seen and connected to community, and it also means that all of us continue to consume a culture of thin, white, cisgender, and heterosexual bodies.

This censorship raises questions about who it continues to give power to, whose access it limits, and which sexual identities and body types it frames as unacceptable.

What do we do?

As someone who interacts with youth, ask them what they think of the content on their social media feeds and how they see sexuality represented on Instagram or Facebook. Help prepare them with the tools to understand how gender, sex, power, and race are present in their lives and inform their decision making. If you need more resources, take a look at Healthy Teen Network’s Youth 360º approach, designed to help professionals working with youth understand how the relationships, communities, and society we live in affect how young people make decisions about their sexual health.

Have something to share?

These biased policies are just one example of the issues we’ll explore at #HealthyTeen20, our conference to be held November 16-18, 2020, in Portland, Oregon. How are you helping young people develop healthy sexualities in this connected age? Share your story: submit a proposal for a workshop, panel, or poster presentation.

Ella Dorval Hall was previously employed with Healthy Teen Network as a Capacity Building Specialist.
