Safe Space: Talking Trust & Safety with Yoel Roth


In this candid conversation, Yoel Roth, VP at Match Group and Twitter alum, shares his personal and professional evolution in Trust & Safety. Host John Starr, Thorn's VP of Strategic Impact, asks Roth to reflect on his early-internet interests and the lessons he learned while shaping the safety landscape of today's biggest platforms. Discover how a curious grad student became a pivotal figure in online protection, and learn why Roth believes we need to shift from reactive moderation to proactive safety by design.

Transcript

John Starr 0:00
I am super excited to be joined by my friend and the current VP of Trust and Safety at Match Group, Yoel Roth. We are here for TrustCon. Are you excited?

Yoel Roth 0:15
Yes, it is the event of the year every year for three years running.

John Starr 0:20
It's true. How many sessions are you speaking at this year?

Yoel Roth 0:23
Just two. But I feel like at TrustCon, the most important thing is always the hallway track. And so there's, you know, an hour or two where you're on a panel or whatever, and then many, many hours of reconnecting with colleagues and friends and catching up after a year.

John Starr 0:38
For sure. So as we mentioned, we're here at TrustCon, and we're actually experimenting with some video content that we're bringing out to all of you. We're here, of course, to talk trust and safety, but from a slightly different angle. I'm not intending to go into the weeds on any sort of trust and safety outcomes, or go back and forth on the merits of new regulation. I'm genuinely interested in talking to the people who make up the space, who helped form the space: their stories and their journeys into trust and safety. I think that's one of the most interesting facets of the space, and we have maybe one of the most interesting people, if not the most interesting person, in trust and safety right now with us today. So we're going to talk to Yoel about his journey into the space and what he's learned along the way. And look, whether you're a seasoned pro, or you're new to the space, or you're interested in trust and safety and trying to break in, I hope you'll find this interesting. So, Yoel, let's start with where you are now at Match Group. What does your job as VP of Trust and Safety look like there? Those listening may not fully understand the complexity, especially of your organization. Give us a little sense of your day-to-day and your role.

Yoel Roth 2:15
Match Group builds dating apps. Our mission is to help people in the world connect with each other safely and authentically, and find love and relationships and whatever else it is that people are looking for. And we are the parent company of some of the biggest apps in the dating space, including Tinder and Hinge and dozens of other services worldwide. And so that's the first thing folks should know about my role at Match Group: it's not one company or one app, it's 40 different apps, and each of those apps works a little bit differently. Each of those brands approaches trust and safety questions in slightly different ways, and sitting where I do at the central level, our job is really to help coordinate all of the work happening across Match Group on safety and integrity issues, and make sure that we're all building in a positive direction, that we're taking the lessons from each of our brands and socializing them portfolio-wide, and that we're continually pushing forward the state of the art of trust and safety across every one of our 40 apps and services. Easy job. You know, I've been in the role about six months now, and I'll say every day I'm learning something new.

John Starr 3:31
Yeah. You know, that's actually a really good segue. So, from a very well-known trust and safety pro to maybe one of the most well-known Twitter alums: Sinead McSweeney. Shout-out to Sinead if she's watching. She did a TED Talk where she talked about her life, you know, growing up, and how people always would ask her what she wanted to do when she grew up. And she really posited this idea that Twitter didn't exist when she was growing up; the role that she had didn't exist. And that's very true for a lot of people in the trust and safety field. Take us way back. What did you want to be when you were growing up?

Yoel Roth 4:21
I wanted to be a paleontologist, because I just loved Jurassic Park. But, you know, I grew up when the internet was first becoming a really mainstream phenomenon. I remember when we got our first dial-up modem at home. I remember when we got our first broadband access at home. And I remember, as a middle schooler and a teenager, the way that the internet helped me connect with people in the world who shared my interests, whether it was arguing about politics, or video games, or, I played the cello, I found communities of folks who shared those interests. And it felt magical from the very beginning. From the weird moments of AOL chat rooms and ICQ and message boards and LiveJournal, there was something about that ability to connect with people that felt transformative for me, and that I felt was going to be one of the defining features of my life. I didn't know exactly what form that would take, but, you know, especially when I was in high school, I was starting to figure out my sexuality. It was something that was so important for me and for how I developed that I realized I wanted to spend my life trying to make sure as many other people as possible could have those positive, transformative experiences of technology as I did. And that took a lot of different forms over the course of my career, but I feel like the unifying thread is: the internet is magical. How can I preserve as much of that magic and push back on as much of the dark side as possible?

John Starr 6:06
I think that's really interesting. So let's maybe take a beat here. You went from that sort of revelation, if you will, or that kind of acknowledgement, to really getting your start in this space in academia. Can you talk about what that was like? And also, was the field of trust and safety formed then, or was it something where we were still struggling to find the vocabulary?

Yoel Roth 6:39
Certainly I was struggling to find the vocabulary for it. You know, I went into academia because, you could say, it's the family business. Nearly everybody in my family is a Dr. Roth. And so when I graduated from college, it was sort of a question of, well, are you getting a PhD, or are you going to law school? And I made the foolish decision to get a PhD and studied what I was interested in, which was the internet. When I was in college, something crazy happened, which was the iPhone came out, and I was an early adopter. I got one on day one. And to me, the iPhone, again, felt like one of these sea-change moments for technology, and I got really interested in what the rules were for the iPhone. Specifically, I kept seeing these moments where some things were or weren't allowed on the App Store, and I kept asking the question: well, why is this the case? I read Walter Isaacson's biography of Steve Jobs, in which he recounts some conversations he had with Jobs about his ideas for the App Store. And he said, this being Steve Jobs: you know, I don't like porn, and so I want to give people on the iPhone freedom from porn, and therefore the App Store doesn't allow porn. And I was just struck by the absurdity of that, right? Feel however you want to feel about porn, but this device is in my pocket, your pocket. Billions of people have smartphones. Every one of those billions of people is affected by a decision that was made because Steve Jobs didn't like porn. The legitimacy of that choice, and the system of governance underlying it, struck me as something worth studying, and so I worked on a PhD about exactly that.

John Starr 8:34
Very cool. So you worked on your PhD. You defended your dissertation, and then you went to Twitter.

Yoel Roth 8:45
My first time at Twitter was actually while I was still working on my dissertation. And as folks who have come from an academic background will know, everybody who's on the journey of writing a dissertation at some point hates it, and I reached that point of absolutely hating my dissertation. So I thought, what can I do for a summer that isn't writing my dissertation? I don't recommend this as a career move, but I applied for one internship, at Twitter, which was a platform I really liked, and for whatever crazy reason, the people at Twitter decided to hire me.

John Starr
Was this the time I missed your interview?

Yoel Roth
Yes, one of my interviewers was John, who totally missed my interview.

John Starr
Not my proudest moment. You know, it happens.

Yoel Roth
Yeah. But I had the opportunity to move to San Francisco and spend three months at Twitter just researching safety and working with the team that I learned was called trust and safety. I didn't really know what this work looked like from within a company before, but I got to meet some of the people doing it in the earliest days of the company. I got to participate in their discussions and their debates. I got to do content moderation hands-on and sort of feel, emotionally, what it's like to have to make these decisions. And at the end of my summer, I realized there was nothing in the world I wanted to do more.

John Starr 10:10
Wow. So can you maybe give a little bit of color? Because I totally remember that time. And putting aside the fact that I totally whiffed, I was doing something really important, I'm sure. You were single-handedly dealing with ISIS or something. And I'm literally DMing you, apologizing and hoping to get on another call with you. But putting that aside: I think you did a really, really great job of giving a general sense, and I'm paraphrasing a bit, that you essentially saw what your future could be, with the humans and with the problems and the intersections. Was there a moment during that internship where you were like, oh yeah, I found it? Can you give us a little bit of texture?

Yoel Roth 11:11
So every Friday at Twitter, we had something called Tea Time, and it typically did not involve tea. It usually involved wine and beer. But the idea was the whole company would get together, executives would talk about strategy and what we were working on, and we'd all hang out and have a good time. And I remember that at one point, while I was an intern at Twitter, there was a user protest. I forget exactly what our users were mad at us about; it was definitely related to our handling of safety issues. And I remember thinking: you know what, what if I download all of these tweets and classify them, and just start saying, it's not "a user protest," it's a protest about this specific thing? And so I developed a content classification taxonomy, and I broke down the thousands of tweets that were part of this protest and did what grad school had trained me to do, which is produce data analysis. And I said, you know, here are the top 20 themes in this protest conversation. What was really surprising was that it wasn't people who were just kind of nebulously upset. It wasn't even mostly about our policies. It was about the product. It was: we want the block feature to do this, and today it doesn't do that; or we want search to work this way and not that way. I didn't know if anybody would care about this analysis. I sent it to my boss and was just like, hey, here's a spreadsheet, maybe this is interesting to you. And then I'm sitting at Tea Time that Friday, and Dick Costolo, who was the CEO of the company at the time, says: our trust and safety team just did this incredible analysis of user discussion about what people want our safety features to be, and we're going to do that. And I was just flabbergasted. This analysis that I had done in a couple of days had made its way to the CEO of the company, and there was the CEO saying, we're going to do what our users are telling us, because one random dude in trust and safety listened to users and wrote it down. And I went back to graduate school, and over the next few months, I watched feature after feature after feature come out at Twitter that were the things on my spreadsheet.

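(Editor's note: for readers curious what Roth's classification exercise might look like in practice, here's a minimal, hypothetical sketch: bucket free-text complaints into a small theme taxonomy by keyword and count the top themes. The taxonomy, keywords, and example tweets below are invented for illustration; Roth's actual taxonomy and data aren't public.)

```python
from collections import Counter

# Invented taxonomy: each theme maps to keywords that signal it.
TAXONOMY = {
    "block_feature": ["block", "blocking"],
    "search": ["search"],
    "reporting_flow": ["report", "reporting"],
}

def classify(tweet):
    """Return every theme whose keywords appear in the tweet text."""
    text = tweet.lower()
    themes = [theme for theme, kws in TAXONOMY.items()
              if any(kw in text for kw in kws)]
    return themes or ["uncategorized"]

# Invented example tweets standing in for the protest corpus.
tweets = [
    "Why doesn't the block feature stop replies?",
    "Search should let me filter out abusive accounts",
    "Reporting a tweet takes way too many steps",
]

# Tally themes across the corpus and print the most common first,
# the "top 20 themes" style of summary Roth describes.
counts = Counter(t for tweet in tweets for t in classify(tweet))
for theme, n in counts.most_common(20):
    print(f"{theme}: {n}")
```
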
John Starr 13:35
So that's really great, and that's really insightful to, I think, the culture that was there then, and kind of the experience that I think a lot of people had there. So you go back to school, and then you come back to Twitter. We could spend a day in here talking about that. What was that like for you at Twitter? And, full disclosure, you and I worked together at Twitter.

Yoel Roth 14:05
I learned everything I know about trust and safety from John.

John Starr 14:10
And so one of the things that I think is true about Twitter is you have kind of different chapters there, as you did as well. Try to give us a little bit of a flavor of your time at Twitter. What did you learn, and what were some big takeaways that stay with you from there?

Yoel Roth 14:37
You know, one of the defining features of Twitter, culturally, was that it was a company very willing to give smart people the space to explore what they thought was impactful or interesting. And I was really fortunate to have leaders and mentors at Twitter, like you and like Del Harvey, who would listen. I would come to you with something that I thought was a problem, and you'd say, okay, go explore that. For example, when I started at Twitter, I got really interested in data licensing and the Twitter API, which was this total niche area that nobody was really paying attention to. I thought it was kind of cool and kooky, and so I was like, what if I paid a little bit of attention to this, in like 20% of my time? And so I started to build out some policies, and started to build out some tooling. And then about a year later, Cambridge Analytica happened, and all of a sudden everybody in the tech industry is thinking about APIs and data privacy issues. And Twitter was ahead of that, because the company gave me a little bit of space to just follow this issue I happened to be interested in. It was such a defining thing at the company that at every juncture in my journey, and I think the journey of lots of other people, we had limitless opportunity for what we could work on, and the freedom and the flexibility to explore those issues. And so one of the takeaways for me was: don't just do the things that you already believe are going to be the most important or the most significant, or the things everybody wants to work on. If you're interested in some niche issue, go work on it, explore it, see if there's something to it, because you don't know what's going to be the issue a year or two years from now, and it might well be that niche issue that you worked on today. The other big defining moment for me at Twitter was realizing just how eclectic and diverse the service is. In my academic life, the fancy word for this was "polysemic": services are what people make of them, and Twitter was really defined by that. So many of the things that make Twitter what it is, from the hashtag to @-mentions and replies, those were all user-derived innovations. They were things that the community on Twitter built, and then the company helped implement in the product in certain ways. And I always thought that was really magical, right? It was not that this product was our top-down vision. It was more like we were stewards of this thing on behalf of the community, and we were building something that would respect what the voices on Twitter wanted, and the diversity of those voices, the different ways people were using our product, from music to sports to TV to politics to anime discussions. All of that really taught me to focus not just on what most people are doing, but also on what marginalized folks are doing, what the non-dominant voices are doing, and how we can build a product that serves them as well.

John Starr 17:50
Very cool. And so, just to maybe tie a bow on your journey, share a little bit about Match Group. When I heard you went there, it wasn't super surprising to me, especially based on, well, we didn't talk about what you did your dissertation on, or the kinds of things that interest you, but it wasn't shocking to me. Can you maybe give some color on why you said yes to them?

Yoel Roth 18:18
I've found dating apps fascinating for my entire life. I can count on one hand the number of dates I've been on in my life where I didn't meet the person online. I met my husband on an app. And I think dating and love and romance and relationships are a foundational and universal part of the human experience, and today a lot of that experience plays out online. For LGBTQ folks, more than half of relationships form online. Most people have at least tried online dating. And so I think there's a ton of opportunity there to build products that not only are ubiquitous and quite widely used, but that also help make people happy. You know, I really loved the work that we did at Twitter to give people a space to debate politics and to have civically important conversations. And now I have the opportunity to work on products that help people find love. I think both of those are important, but they're different facets of the human experience, and so I like getting to work on both of them.

John Starr 19:25
Is there a piece of advice that you would give a young academic looking to explore or break into the trust and safety field?

Yoel Roth 19:41
Work faster, and work publicly.

John Starr
Tell me more about this.

Yoel Roth
Academia is, for lots of very good reasons, all about the slow and methodical production of knowledge. It's about producing work that will stand the test of time: you'll write an article, and 50 years from now, a class of students will read your article and say, damn, that's really smart. And that's important, but that's not the thing that's necessarily going to drive impact in the real world. If you care about shaping what technology looks like, if you care about influencing policy discussions, if you want to make the products you use better, think faster and think scrappier, right? Publish your work now. Publish it open access. Seek opportunities to publish in journals with rapid turnaround times, and then really think about how you can share your work publicly as you're doing it. Be willing to put your rough drafts out there in public and get feedback on them. Feedback is hard, but approach it with humility and openness and a growth mindset, and you can have way more impact by just doing your work out there in public, in the world, rather than saying, I can only put my work out there after it's gone through three years of peer review.

John Starr 21:01
Really powerful: work faster and scrappier. Let's maybe zoom out a little bit and talk about the space. From your perspective, what is the most misunderstood part of trust and safety, or of what it's like to be a trust and safety pro? What do you wish more people knew about us?

Yoel Roth 21:31
I think the biggest misconception about trust and safety is that it's just censorship, or it's just reactively cleaning up bad stuff after it happens. One of the most interesting developments in the trust and safety field, from when I first started studying it 15 years ago to now, is that we're not just the janitors of the internet. We're thinking about how to build products that are resilient to abuse upfront. Julie Inman Grant, who's the eSafety Commissioner in Australia and a former colleague of ours at Twitter, has for years now been talking about safety by design as a core thing trust and safety needs to be doing. It's about making your products more resilient to misuse, and safer, through the very fundamentals of how the product is built, and I think that's incredibly important. It's not just about dealing with harmful behavior once it happens. It's about making products that encourage civility and respect and kindness and authenticity. And I think there are ways you can build a product that does that, and that's such a powerful shift in the way we think about trust and safety. It's something we were on the leading edge of at Twitter; we built one of the first teams that did this (shout-out to Product Trust). And I think it's really where impact in this space is going to be going forward.

John Starr 22:57
Awesome. So if you're in trust and safety, you know that it is not a perfect science. It's a continuous-improvement engine, always striving to be better. From your perspective, what's a part of the space where we could be 5%, 8% better?

Yoel Roth 23:32
One of the most important things we can do as a field is be more willing to share with each other what we know, what we've learned, and what hasn't worked. A lot of the expertise in the trust and safety field stays locked behind the walls of NDAs and within big companies. The amount of expertise that exists at Meta, at Google, YouTube, Reddit, Pinterest, Snapchat, is immense. We could solve all of the world's problems if we could figure out a way to pool our knowledge and our experience and our tooling and what we've built. Spaces like the TSPA and TrustCon are starting to build more of that community. But I think the direction of travel here is not going to be one company getting incrementally better and better. It's going to be all of the professionals working in this field bringing their 5% improvement together with everybody else's 5% improvement. And then we can really start to chip away.

John Starr 24:34
What does the future of trust and safety look like?

Yoel Roth 24:39
I'll give you two answers to that. The first one is safety by design. I can't stress enough that we have to start thinking about trust and safety work as not being moderation after the fact, but design before the fact. We've got to be building a culture of trust and safety within product teams. We've got to have every engineer working on product development thinking about safety and misuse. We've got to be red-teaming products before we launch them, so that we can think about adversarial use ahead of time. And I think that's one of the most important things we're seeing companies build out today, and I really see that as the future of the field. The second piece of it is something that I've been doing as a side project since I left Twitter, which is helping to build out a hub for open source trust and safety tooling. One of the intuitions that I've had, and that a few other folks in the field have had, is that the technology that we build, from hashing and matching systems to rules engines, is locked away within specific companies, and each of us who move between companies end up building the same things over and over and over and over again. What would it look like if we broke that cycle? What would it look like if we built a hasher-matcher and we open-sourced it, which is what our colleagues at Meta did? What would it look like if we built a rules engine like Smyte or Botmaker, tools that we had at Twitter, and we published the code on GitHub so anybody could use it? It's that motivation that's leading myself, Camille François, who's a professor at Columbia, Dave Willner, Juliet Shen, and a group of us to start to build out a hub for accelerating this type of open source development. We think this can accelerate work at the big companies, and then, critically, we think it can accelerate work at small companies just entering the space, so that the entry costs aren't so high. If you want to build an amazing new social product, you should be able to use free and open source tools that build a safe experience from day one, without needing to spend years and years building trust and safety infrastructure first. And so I'm really excited about that as a future direction for the field.

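(Editor's note: a minimal, hypothetical sketch of the hash-and-match idea Roth describes. Production systems, such as Meta's open-sourced Hasher-Matcher-Actioner, typically use perceptual hashes like PDQ so near-duplicate images still match; this toy uses exact SHA-256 hashing purely for illustration, and the hash list is invented.)

```python
import hashlib

# In production, this would be a vetted, shared database of hashes of
# known-violating content; here it's a single invented entry.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-violating-payload").hexdigest(),
}

def matches_known_content(payload: bytes) -> bool:
    """Hash the uploaded bytes and check membership in the shared list."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES

print(matches_known_content(b"example-violating-payload"))  # True
print(matches_known_content(b"a harmless photo"))           # False
```
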
John Starr 26:53
Very cool. Just to double-click a little bit on the safety by design component. Obviously, Thorn is a nonprofit. We are focused very clearly on, I would say, a very small but important part of the trust and safety landscape of harms. For a company like Match Group or Twitter, how do you think about staffing, or having the kind of expertise, to think about concepts like safety by design with a child's perspective in mind, or a teenager's perspective, or a young person's perspective, whether in the context of safety by design or in general? How do you think about capturing these very unique subject matter experts and tying them into your general process?

Yoel Roth 27:48
Yeah. I mean, the most important thing you can do is talk early and talk often to a wide range of experts and voices in the field. Even a company 10 times the size of Match Group, 10 times the size of Twitter, hell, 10 times the size of Meta, is never going to have every perspective represented internally. You can do your best to hire diverse teams, and you should, but you're never going to have a comprehensive perspective. And so partnership with outside groups, including Thorn, who bring that wide range of intersectional expertise, is really, really critical. At Match Group, we have a Trust and Safety Advisory Council, which Thorn is a part of, where we talk about the policies we're developing and the products that we're building, and we bring some of the big, sticky questions about what we're building to the group, not to, you know, just check a box and tell you, here's what we're doing, but to actually seek feedback and perspectives and questions. And I would really encourage every trust and safety practitioner to seek those opportunities, to find partners in spaces where you have blind spots, and, again, don't treat it as a PR exercise. Don't treat it as box-checking. Treat it as a chance to approach these questions with humility and with a growth mindset.

John Starr 29:09
It's been such a pleasure talking to you, man. It's been a long time. I can't thank you enough for joining us, and I hope you have a really good rest of the week at TrustCon.

Yoel Roth 29:18
Thanks for the invite. Awesome. Bye.