Post By: Safer / 2 min read
By John Starr, VP of Strategic Impact at Thorn, and Aaron Rodericks, Head of Trust & Safety at Bluesky
In the past few weeks, Bluesky saw more than 7 million new users flood its platform, a surge that would test any platform's safety practices and infrastructure. This influx highlights the demand for a platform free of spam, scams, and abuse, and brings an important reminder: safety by design is the foundation for responsible growth in the digital age. Bluesky is building that kind of platform, offering consumers a new option.
With Bluesky's wave of new users came the predictable challenge of an uptick in harmful content posted on the network. The reality is that child sexual abuse material (CSAM) – and other threats to kids – can thrive anywhere there is an upload button. This problem is not unique to any single platform, but platforms that integrate trust and safety practices early, adopting cutting-edge tools and proactive strategies, are better equipped to address these risks head-on.
Because trust and safety has been a priority for Bluesky from the start – the platform has been using Thorn's Safer since August 2023, alongside other content moderation tools that address a range of harms – it is well positioned not only to handle the influx of users, but to keep its userbase safe and protected at any size.
Using Thorn's Safer, Bluesky is already able to detect CSAM with remarkable accuracy. While human review of harmful content is a necessary step in ensuring child safety, Bluesky's proactive approach and use of Thorn's tools reduce the burden on human moderators by automating parts of the detection process – while retaining expert human oversight for nuanced decisions that require judgment and context.
While no platform can completely eliminate harmful content, the combination of proactive technology and a dedicated trust and safety team can mitigate these risks significantly.
Safer flags potential child sexual abuse material at a scale that would be impossible to manage manually. This ensures that Bluesky’s growing user base can engage on the platform while being protected from some of the most egregious and harmful types of content on the web.
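To make the division of labor described above concrete, here is a minimal, purely illustrative sketch of how a platform might triage uploads: known-bad content is flagged automatically against a hash list, uncertain content is routed to human moderators, and the rest passes through. All names, thresholds, and values here are hypothetical; this is not Safer's actual API or Bluesky's implementation, and real systems typically use perceptual hashing (which matches visually similar images) rather than the cryptographic hash shown here, which only catches exact copies.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ModerationQueues:
    """Hypothetical queues a trust & safety team might work from."""
    auto_flagged: list = field(default_factory=list)
    needs_review: list = field(default_factory=list)

# Hypothetical hash list of known harmful content (a stand-in for the
# industry hash lists a service like Safer matches against).
KNOWN_HARMFUL_HASHES: set[str] = set()

def triage_upload(image_bytes: bytes, classifier_score: float,
                  queues: ModerationQueues) -> str:
    """Route one upload: exact hash matches are flagged automatically;
    uncertain classifier scores go to human review; the rest pass."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HARMFUL_HASHES:
        queues.auto_flagged.append(digest)   # no human needed for known matches
        return "flagged"
    if classifier_score >= 0.5:              # illustrative threshold only
        queues.needs_review.append(digest)   # humans make the nuanced call
        return "review"
    return "allowed"
```

The point of the sketch is the shape of the pipeline, not the details: automation absorbs the unambiguous, high-volume matches, so human judgment is spent only where context actually matters.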
What other platforms can learn from Bluesky’s rapid growth
Bluesky’s experience offers a roadmap for other platforms navigating periods of rapid growth:
- Anticipate risks: Platforms must recognize that bad actors will try to scale harmful content alongside user growth, and plan accordingly.
- Invest in the right tools: Advanced AI – paired with careful review by human trust and safety teams – is essential for effective content moderation at scale.
- Invest in safety by design: Safety should never be an afterthought, and user safety is always worth the investment. Platforms that prioritize trust and safety earn user trust and loyalty. By building in safety at the design phase, platforms of all types not only better protect users, including children, but also lead the charge in ethical innovation.
Looking ahead
For Bluesky, this moment is just the beginning. By investing in cutting-edge solutions and building a culture of safety, it has set a new standard for platforms navigating the complexities of rapid user growth.
As more users migrate to new platforms and the internet becomes increasingly decentralized, the stakes for content moderation will continue to rise. Collaboration and scalable tools and technologies will be key to maintaining stronger, safer social networks, and safety itself is becoming a product differentiator for users.