May 28 2024

Youth Tell the Truth About Safety Tools: Advice on How to Improve These Tools From Actual Teens

Post By: Safer / 6 min read


Platform safety tools — like blocking and reporting — are often a child’s first choice for responding to a harmful sexual interaction online. Instead of seeking support from a parent or caregiver, kids tell us, they’re far more likely to address the situation alone using in-platform features.

In fact, it may be the only time they signal to someone that something bad is happening to them.

This underscores the critical need for platforms to design safety tools that are accessible — built to support kids as they attempt to report potential danger and protect themselves.

And who better to provide input on those features than youth themselves? After all, nobody understands what kids experience online like they do.

That’s why Thorn asked young members of our NoFiltr Youth Innovation Council, aged 13 to 17, for their thoughts on platform safety tools. The teens spoke candidly about why and how they use the tools, their pain points, and how platforms can improve these crucial features to better serve kids navigating risky situations.

What young people think of safety tools

In our conversations with youth, a few things became clear: Young people rely on safety tools for all kinds of reasons, but they’re also skeptical that the tools actually work. Here’s what they told us:

They use tools to avoid harmful interactions

Young people use safety tools primarily to remove distressing content or users from their online spaces. They say doing so allows them to regain a sense of security or even simply peace of mind. They may block or report unwanted sexual advances, harassment, or triggering and offensive content:

  • “An account that posts explicit and sexual content followed me and I immediately blocked and restricted them.”
  • “There was an account leaving racist comments on my TikTok video so I blocked them. I didn’t think twice.”
  • “On social platforms, predator bot accounts can message you asking to click on links or send images and I removed the account from my DMs.”
  • “I think it was healthier for me to block them, as it gave me a break and reassurance that I would not get triggered by their content again.”

Blocking is the preferred tool

When it comes to choosing which tool to use, most youth say they prefer to block, rather than report or mute. Blocking feels clearer and more immediate, and it creates a barrier between the child and the harmful content or behavior.

  • “I personally prefer blocking and I don't use it very often but I like it because it's easier to use and I know what blocking does.”
  • “Blocking made me feel like I had space with someone and it made me feel better since I knew I wouldn’t have any more contact with the person.”
  • “I prefer blocking rather than reporting because I know I can always undo that decision if the time changes. I really have not been in many instances where the reporting tool is necessary.”

Reporting can be confusing

Confusion over the tools, especially the reporting tool, bubbled up repeatedly. This lack of understanding discouraged many youth from using the tools at all. Some youth worried they’d be de-platformed or experience retaliation from the aggressor. Others felt the action was futile because they could simply be recontacted through a new account or on another platform.

  • “Sometimes comments on Instagram are harmful and rude. Oftentimes I have to just look past it because I don't know what reporting it will do to my account.”
  • “I don't quite understand what will happen after using the report action on any social media platforms.”

The tools don’t seem all that effective

Youth say slow responses from platforms and repeated recontact make using the tools feel futile. This skepticism is backed up by data: Thorn’s research found that after blocking or reporting threatening users, a staggering 50% of minors were recontacted.

  • “I wasn’t able to block every racist comment I got because it was a viral video.”
  • “I think Instagram has made it a lot easier but at the same time, because of that, I also feel that it is ineffective.”
  • “I honestly don't know how helpful the tool is after I use it, but I still try to use it if I need to.”

Simplicity is key

Youth told us simplicity, ease of use, anonymity, and tangible results gave them confidence in the tools.

  • “[It] stopped the account from connecting with me.”
  • “There were more follow up questions than I expected, but they are definitely necessary.”
  • “It’s just a button to press.”

How to improve safety tools, according to teens

We asked the young council members to put themselves in the designer’s seat. What changes would they make to these tools to ensure the best experiences possible?

Make them easier to find

Make them obvious and easy to locate, youth say. Additionally, they add, integrate the tools into the places where they might be especially needed, such as messaging flows. After all, even tiny friction — like having to hunt for a tool — can deter a child from using it.

  • “Make the report buttons in bold letters so they stand out to youth.”
  • “Make it easier for users to block in messaging. Ask if the user wants to block if messages become nsfw [not safe for work] or rude.”

Explain how to use the tools — earlier on

Kids say trying to figure out how to use a safety tool — and whether they even should — only adds to their anxiety. That’s why they suggest educating users about the availability and use of safety tools earlier in the platform experience — and doing so in simple terms that kids understand.

  • “Include information about reporting in the terms and services before signing up for the app.”
  • “Sometimes the wording of the block/restrict process is daunting, making it feel like blocking or reporting someone is a big deal, therefore deterring us from using it.”
  • “I think they are pretty easy to find but too long and complex to understand.”

Offer more blocking options

To address the issue of being recontacted through new accounts, youth recommend providing more comprehensive blocking options. Platforms can also suggest further protective measures, such as turning off the ability for a profile to be “recommended.”

  • “I think that there needs to be an option to block all other accounts that they can create later as well.”
  • “Make another option to block all accounts that they make.”

Provide status reports and updates

It’s crucial, youth say, to provide confirmation that their action went through — that the account was blocked or their report was received. It’s also important to keep users updated so that they feel supported. When youth feel their submission was ignored, they’re less likely to take action in the future.

  • “Update me whether the account has been removed, and in a faster manner. It usually takes a while for the platform to take action.”
  • “The best advice I can think of is to follow up on the situation afterwards to help the youth feel supported.”

Bring in youth perspectives

Many young people want to participate in creating a safer internet. NoFiltr’s youth members join because they’re passionate about the digital issues kids face every day and seek to elevate youth perspectives in these conversations.

  • “I think getting youth perspectives and getting them involved in the questions and process [is key] — because we have lived through these pieces.”
  • “Involve youth with surveys, councils, or opportunities! Youth will jump at the chance!”

Bringing youth voices into the design process takes the guesswork out of creating features built to protect them. At the end of the day, children shouldn’t carry the burden of recognizing threats and defending themselves — it’s up to all of us to build a robust child safety ecosystem. But when kids do encounter risky situations on platforms, safety tools should be right there, ready to help them report danger and remove themselves from harm as quickly and easily as possible.
