Sep 3 2024

Safe Space: Talking Trust & Safety with Patricia Cartes

Post By: Safer / 23 min read

From aspiring United Nations interpreter to Trust & Safety leader, Patricia Cartes, Head of Trust & Safety at Cantina AI, takes us on her global journey into online safety. Thorn's VP of Strategic Impact, John Starr, explores Cartes' journey from translating spam websites to shaping content policies at tech giants.

Patricia reflects on the field's evolution, from the early days of T&S under "online sales and operations" to its current status as a flourishing field critical to business success. She offers a robust perspective on balancing free speech with user protection, drawing on her extensive experience in public policy and her global outlook. Her vision for the future of the field is full of determined hope, where growing professionalization and dynamic cross-industry collaboration pave the way to diligently address complex online safety issues such as child sexual exploitation.

Transcript

John Starr (00:00):
I'm here with my friend and head of Trust & Safety at Cantina AI, Patricia Cartes. Patricia, thank you for joining us. It's TrustCon week. Are you excited?

Patricia Cartes (00:12):
So excited. This is the best week of the year.

John Starr (00:16):
Why?

Patricia Cartes (00:17):
Because we all come together, all of our friends. Please don't tell my kids or my husband that this is better than their birthdays.

John Starr (00:23):
I won't. I won't.

Patricia Cartes (00:24):
But yeah, it's like we come together, and I feel so energized. I think all year, I live for this conference, and it reminds me why we work in this field, that there's a path forward, that I'm not alone, which I forget.

John Starr (00:39):
You are not alone.

Patricia Cartes (00:39):
Yeah.

John Starr (00:40):
Yes.

Patricia Cartes (00:40):
And so, yeah, couldn't be more excited for the week.

John Starr (00:43):
Well, I appreciate you taking some time out of your week to join us and chat with me. So we're going to be talking Trust & Safety, of course, but more at a human level. We're not going to debate the merits of any sort of regulation or talk about any specific outcomes. I'm really curious to talk journeys and pathways to Trust & Safety. So many people we've worked with over the last decade plus have had just very interesting journeys into the space.
(01:13):
I think it's one of the most interesting parts about Trust & Safety, and what makes this week so exciting and interesting, as you noted. And so I would love to spend some time talking about your journey and lessons along the way. You have had an incredible career, and I think you're going to have interesting perspectives for us to chat about. The goal is that whether you're a seasoned Trust & Safety pro, or maybe you're just trying to get into the space, you find this interesting and helpful. So what I'd love to do is maybe, can you tell us a little bit about what your role is now?
(01:58):
Give like a three-minute commercial on what the day in the life looks like, and then we're going to go backwards a little bit, and because you have a really interesting Trust & Safety journey from ... It's got public policy, it's got consulting, and I think it's going to be really interesting. So tell us a little bit about what you do now.

Patricia Cartes (02:21):
Yeah. I do everything under the umbrella of Trust & Safety at Cantina AI, which is a social AI platform, and that means that humans can interact with each other, and they can also interact with bots. And so you, as you can imagine, you have the challenges of human interactions, which by nature are difficult, whether online or offline, but then you also have these bots that are hyper-realistic, and that are becoming a part of our lives. And my role is to think through potential risks and to try to mitigate them. It's impossible to prevent every potential risk, but you can try to mitigate. I think that's our job.
(03:18):
And so on a normal day, I might be drafting policy in the morning, I might be engaging with a regulator in the afternoon about what we do with training data on AI models or bot prompts that might be harmful or illegal. And maybe in the evening, if I'm unlucky, I'm dealing with an escalation because they come at all hours.
(03:44):
And we are still invite-only, so it's really incredible. I think it speaks volumes of the complexity of Trust & Safety, that even before you launch a product to the general public, you can still be as busy as I promise you I am.

John Starr (04:03):
I believe it. I believe it. I believe it. And obviously, it sounds like a very interesting spot and position that you're in right now, and so I want to get a little bit into how you got there. It's a really big job.
(04:21):
And I'm sure you've learned a lot along the way. And so one of the things that I want to anchor us on is we have a former colleague. Shout out to Sinéad, and she did a TED Talk. And in her TED Talk, she really, among other things, captures this kind of idea of like we were all asked as kids, "What do you want to be when you grow up?" Right?
(04:48):
And she kind of turns that on its head a little bit because she's like, "Well, the job I have didn't exist when I was a young lady." And so take us all the way back a little bit.

Patricia Cartes (05:00):
Yes.

John Starr (05:01):
What did Patricia want to be when she was growing up?

Patricia Cartes (05:04):
Yes. A translator.

John Starr (05:05):
Yeah.

Patricia Cartes (05:06):
A translator, and more specifically, an interpreter, which is a simultaneous translator in the UN. And that was-

John Starr (05:15):
I'm not surprised by this. Go on.

Patricia Cartes (05:17):
That was my dream.

John Starr (05:17):
Go on. Tell us more.

Patricia Cartes (05:18):
At age 10, I told my parents I want to be a translator, and I want to work in the UN with people from different countries, different cultures, facing different political issues, and I want to be the conduit to communication. I'm sure I wasn't as articulate at that age, but to my surprise, my parents took me seriously. They were both academics. Maybe that's why they took me seriously, and they said, "Great, let's get you to go to England, and Ireland, and Canada, and France, and you're going to learn English and French, and then we'll get you into college so that you can be in that path." But unbeknownst to them, one of the countries where they sent me was Ireland, and Ireland was living a technological transformation at the time.
(06:16):
The Irish government was very smart in investing in tech companies in the early 2000s, and in 2006, when I was not even out of college, a company called Google saw my resume and contacted me about a web spam job. And the main reason for that was that I spoke three languages, and they needed people that could translate the websites and really understand the spam. And as with anything in Trust & Safety, there are cultural nuances.

John Starr (06:53):
Sure.

Patricia Cartes (06:54):
So I got started in-

John Starr (06:56):
So, yeah. So let's maybe kind of dig in a little here, because did you see that opportunity to achieve that UN perspective through this, or how are you viewing this? Were you viewing this as like a pathway to that ultimate place where you wanted to get?

Patricia Cartes (07:17):
Yes, because not to bore you with the world of translation, but to become a UN interpreter, you ... I mean, I'm sure now, this has changed, but for the most part, you had to do a Master's in either Geneva or Paris, and it was really hard to get into those Master's programs. It required many years of experience in the translation field, and me having fallen in love with Ireland, I thought, "Well, if I work in Ireland for a few years, I'll achieve the level of proficiency that's required to take the tests to go into those universities." And so it was a path, and, really, I was really passionate about translation. I was also thinking about academia as a potential path for me. I was very into James Joyce, Ulysses.

John Starr (08:09):
Yup.

Patricia Cartes (08:10):
And when I started working at Google in web spam, which, again, was like, "I'm going to translate some websites, I'm going to assess them for spam," I still didn't know Trust & Safety, what it entailed.

John Starr (08:25):
Okay. Yup.

Patricia Cartes (08:25):
Trust & Safety was not a term that was used.

John Starr (08:29):
Yup.

Patricia Cartes (08:29):
We were under online sales and operations. There was not-

John Starr (08:33):
Sales.

Patricia Cartes (08:34):
Yes, online sales and operations is the org.

John Starr (08:35):
Okay.

Patricia Cartes (08:36):
There was not such a thing as a Trust & Safety org. We were also called Search Quality, and we would see attempts to manipulate the search results pages. So I'm going to hide some keywords and I'm going to rank higher. Now, that's what got me into Safety.

John Starr (08:55):
Okay.

Patricia Cartes (08:56):
It was the breaking of the rules to achieve a result through means that were not legitimate, and I have this distinctive memory of realizing ...

John Starr (09:10):
Yeah, tell me.

Patricia Cartes (09:11):
Like fighting spam, fighting harmful uses of technology. This is what I want to do, and I can't picture a life where I wouldn't do this.

John Starr (09:22):
And this was at Google?

Patricia Cartes (09:25):
Yes.

John Starr (09:26):
And so at Google, your goals and your view of what was for you in the future evolved?

Patricia Cartes (09:34):
Yes. That's right.

John Starr (09:35):
Okay. Okay. And so at what point, because I know after Google, I think you went to Facebook.

Patricia Cartes (09:41):
Yes.

John Starr (09:41):
At what point did the vocabulary maybe, or a version of the vocabulary that we use today, become clearer to you, and you're like, "Oh, this is something, Trust & Safety." Walk me through that.

Patricia Cartes (09:56):
I think it was 2009, I've been very lucky to be at the right place at the right time, and Facebook contacted me in 2008. They were about to open up the headquarters in Dublin, and they wanted somebody to work in user operations. So still not Trust & Safety, but user operations, and user operations was looking at abuse reports filed by users. They were also doing other things like security, like, "I lost my password. How do I reset it?"
(10:29):
And that was the first, I think, notion of Trust & Safety, because of the community guidelines, which are the rules governing the site, beyond the terms of use, which were very legalistic. We were starting to assess content against the community guidelines, and those community guidelines were evolving almost on a daily basis. I don't even think they were public when I joined in February 2009, and so it was a time of deep philosophical conversations, but also, most importantly, what I was hoping to do was bring the European perspective.

John Starr (11:08):
Yup.

Patricia Cartes (11:09):
It was a very U.S.-centric work, and Europe was in a very different place. I mean, a lot of content is illegal in Europe that is not illegal in the U.S., and so my role was to, I think I was 24 years old, but I would be like, "Hey, so I assess this report based on the community guidelines. I shouldn't take action, however, I think we should maybe talk to the lawyers because we might need to do something for my jurisdictions."

John Starr (11:41):
Interesting.

Patricia Cartes (11:41):
And that was safety. We were calling it safety and community standards, and I think that's the first time that I kind of like became aware that this is a field, it's rapidly evolving, and the challenges that I'm facing at Facebook, YouTube is facing as well. Twitter was starting to pop its head up, and I think they were, I'm sure starting to face similar challenges, but nobody knew who was who, unless you had worked at multiple companies.

John Starr (12:16):
Got it. So tell me a little ... So you started at Google, you explained the kind of evolution at Facebook and your kind of UN hat, if you will, chiming in on the global perspective. Is this the time when you kind of leaned into that more and continued to do a little bit, or started to do a little bit of public policy work? And how did that get introduced to you?
(12:50):
Because hearing you talk about your kind of childhood dreams or ambitions as a UN interpreter, I've seen you in those types of conversations before. So it's not surprising for me to hear that, but tell me a little bit about how that took shape.

Patricia Cartes (13:11):
Yeah, I love that question because it's such an unexpected path. I think in that year of 2009, I'm deep in the trenches of content moderation. I also become a manager for the first time. I'm bringing in content moderators to help me with France, Italy, Spain, those markets, and one thing happens. Facebook starts to get a lot of questions publicly about, "Why did you take that piece of content down? Why didn't you take that one down?"

John Starr (13:44):
Okay.

Patricia Cartes (13:45):
And the first person representing communications is hired, and the first person representing public policy is hired, Richard Allan. He's now a lord, Lord Allan of Hallam. He's still a lord; he was not a lord at the time. And as soon as those two hires are made and there's a channel for those incoming queries, there's a need for somebody on the Trust & Safety side, which is still not called Trust & Safety, but it is Trust & Safety, to explain what has happened with any piece of content, but also how we should explain the nuance of why we made a certain call, because at times, a call may appear to be wrong, but if you take into account the context around it, it is not wrong in the moment. It's maybe the policy was falling short or ...
(14:37):
And so that became my role: "Give me those escalations, and I'm going to tell you ... I'm going to look under the hood, and I'm going to tell you, 'Here's the action that we took,' and how I would speak about this publicly." And civil society got very interested, because the European Commission was funding two programs, Insafe (Safer Internet Centers) and INHOPE (internet hotlines). Insafe focused mostly on objectionable content that targeted minors; INHOPE focused on child abuse material.

John Starr (15:11):
Sure.

Patricia Cartes (15:12):
And so these two big networks that have centers in each member state of the EU, it's like player number two has entered the room. They entered the room, and they were like, "Okay, so why did you take that down? Can we work with you? Can we give you more context so that you might not make a..."

John Starr (15:31):
Yeah. You were a translator.

Patricia Cartes (15:32):
Yes, exactly.

John Starr (15:33):
You were an interpreter.

Patricia Cartes (15:34):
Yes. Yes.

John Starr (15:34):
Yeah. It's full circle.

Patricia Cartes (15:36):
Yeah.

John Starr (15:37):
That's super interesting, and it makes total sense. And so when did you leave Facebook and go to Twitter?

Patricia Cartes (15:45):
I left Facebook in 2013.

John Starr (15:48):
Okay.

Patricia Cartes (15:50):
And you mentioned our colleague, Sinéad.

John Starr (15:51):
Yeah.

Patricia Cartes (15:52):
She spoke at a parliamentary hearing here in Ireland on behalf of Twitter. I spoke on behalf of Facebook, and I remember watching her. She spoke before me and she got a lot of heat from the members of Parliament, and I thought, "Well, she's got a big challenge on her hands. How fascinating. That would be so difficult."

John Starr (16:17):
Right.

Patricia Cartes (16:18):
And I remember getting home that day and saying, "I could never work for them. That's so much work."

John Starr (16:24):
And then you started in a few weeks?

Patricia Cartes (16:26):
And then I joined them. Yes, a few weeks. I got an email that same night and-

John Starr (16:29):
Stop it.

Patricia Cartes (16:30):
Yes, I did, from the human resources team.

John Starr (16:33):
Wow. Yeah.

Patricia Cartes (16:34):
And, again, I found the challenge fascinating because Twitter was a very open platform.

John Starr (16:42):
Yeah.

Patricia Cartes (16:42):
There was a lot more open communication, less direct messaging. I think at that time, you couldn't even share images on direct messages.

John Starr (16:51):
Right, yeah.

Patricia Cartes (16:52):
Because it was open communication, it was challenging. It also had been used as a tool during the Arab Spring, so it had already shown so much potential for human rights defenders and advancing democratic causes. And I thought, "Wow, with that great power comes great responsibility. I couldn't possibly be in the driving seat of that in any capacity."

John Starr (17:19):
Yeah.

Patricia Cartes (17:19):
But precisely, because it was so exciting, I couldn't really turn my head away.

John Starr (17:25):
Yeah. Was there a major takeaway that you have? Yeah, obviously, we worked together when I was at Twitter. That's where I met you. Is there kind of a takeaway from that point in your career that you have as like a big learning or just like a big moment?

Patricia Cartes (17:45):
Yes. There's a few. I think coming from Facebook, that had a very ... I think there had been a lot of internal discussions about what Facebook should be. Do we want it to be a family-friendly site?

John Starr (18:02):
Yeah.

Patricia Cartes (18:02):
And that showed in the way that we moderated content and set the rules. Twitter was very pro-free speech, and that in itself brought its own challenges. But what I take from that time is you can do free speech right, with all of the challenges that it brings, if you have a very dedicated team. Every person that I met during my time at Twitter was incredibly dedicated, really wanted to push the boundary on free expression, while at the same time, preventing the worst of harms. And I hope you let me borrow your phrase, but it really has inspired the rest of my career: they didn't count on us.

John Starr (18:47):
Yeah.

Patricia Cartes (18:47):
When I think of the bad actors that might think, "Because of free speech, I'm going to get away with this," that's free speech taken to the absolutist perspective.
(19:02):
There's no rule of law. And that team that was so dedicated, I think in my view, regardless of whether we made the right calls in terms of moderating content or building systems, there was a dedication and a passion to prevent harm that stays with me today, and that's why I keep doing this job. I just think of every person that I work with during that time, and I think we really hit a sweet spot, and we could stand really proudly for the decisions that we were making.

John Starr (19:38):
Well said. I'm curious, is there a piece of advice, if there's maybe a young woman, young man who wants to be a translator for the UN perhaps, or maybe wants to be head of Trust & Safety at an AI company, and they're just beginning their journey? What piece of advice do you have for them?

Patricia Cartes (20:08):
I would tell them to persevere. I started from the bottom. When I first joined Google, they would tell you, "You need to review X amount of hundreds of websites a day manually," and I would do double that. That was my goal for the day: I'm going to do double, and I'm going to learn all of the technicalities of spam.
(20:32):
If you're dedicated, if you don't mind rolling up your sleeves, you're going to learn a lot on the front lines, and you need to persevere because it's tough. You see content that is very challenging. You see issues that seem like you can't solve them. You're never going to solve them. It's like, "How are we ever, as a society, going to solve this?" But you're part of a collective, and there's a lot of power in that collective, and that's why this week is so exciting to me.
(21:03):
I mentioned earlier you sometimes feel like you're working alone, but I turn around here today at lunchtime and I talk to somebody, and they're facing very similar challenges. And so I would say, "Roll up your sleeves, persevere, make connections." Every company I've worked for, I've made a point of trying to connect as many people as I could, whether it's in data science, or product, or engineering. I would be this crazy person that at lunchtime, would sit at random tables, and that's hard, by the way. I know I might seem like an extrovert, but I have some introvert tendencies, and putting myself out there at a lunch table when I don't know people is really hard, but getting to know people, making those connections, and even outside of Trust & Safety, I think people are very passionate about the challenges that we face.
(22:01):
And you can harness that interest in the connections that you're making, but don't give up and don't ... Even though you'll see problems that seem unsolvable, work with others, rely on others, and persevere, because this can be your community, and I hope that you find a home in this work.

John Starr (22:24):
I love that. So you work a lot with ... You have, over your career, worked externally, so engaged with a number of groups: civil society, regulators, government officials. What do you think is the biggest misconception, generally, that humans outside of the space have? But I think it's appropriate to ask for you, like, "What's the biggest misconception policymakers have of Trust & Safety?"

Patricia Cartes (22:57):
I think it really boils down to the scale of it and the complexities and the nuances of any piece of content or account in a specific setting. If I give you an example of a content moderation call I have to make, I present some facts to you, and you might say, "Yeah, I would take that down," or, "I would suspend that user." But sometimes when you take a step back and you look at the scale, and you look at the fact that, "Well, that content moderator maybe has just reviewed 500 pieces of content before that," and they have a couple of seconds to make a very complex call, or there is a geopolitical factor that is coming into play. All of those nuances are really hard to capture. And I have conversations, not just with policymakers and regulators, but even with my own family, where they'll say like, "Oh, no, you should allow that graphic artwork on the site because why wouldn't you?"
(24:04):
"It's a drawing." Right? Take [inaudible 00:24:08], which is a beautiful painting. It's, "Well, so what? You should allow it." And then, I have to provide that nuance of, "But I have 13 and 14-year-olds that might encounter that piece of content, even though they didn't ask to see it in any way."
(24:24):
Somebody might share it, and it pops up on their feed. And so that nuance, I think, is sometimes lost in translation. The good news is that over the last 10 years, when I think about the public policy roles I've had at Facebook and Twitter, regulators and policymakers have become very sophisticated in their understanding. And also, we have new pieces of legislation that give them more access to risk assessments and audits.

John Starr (24:55):
Yeah.

Patricia Cartes (24:56):
So I think we're on a good path, but I would say that it really boils down to the context in which our work takes place. It's sometimes impossible to articulate well.

John Starr (25:12):
Yeah. So much of Trust & Safety is being able to understand the perspectives of other humans engaging and how they're situated culturally, globally, contextually. I totally agree. So I've never met a Trust & Safety pro that is like home run perfect. I feel like it's a concept.
(25:36):
The concept of continuous improvement is something that I think has really been core to Trust & Safety. So first, with Trust & Safety specifically, what's one way you'd like to see the space get 5% better?

Patricia Cartes (25:53):
I mean, here, I'm going to talk about Thorn a little bit, because last year at TrustCon, Charlotte Willner, the executive director, who I've worked very closely with in the past, brought up Child Sexual Abuse Material. We would encounter an image, and, "Okay, we can handle it for our platform, but we're a walled garden, and how do we talk to the other platforms?" And it was very discouraging that it felt like there were insurmountable legal challenges to sharing information and best practices with people in the industry.

John Starr (26:30):
Sure.

Patricia Cartes (26:31):
When I look back now, I'm in awe at the amount of collaboration that exists. Classifiers like Safer have completely transformed the industry, and it's not an exaggeration. I wish somebody had told me back in 2010, "Don't worry. There's going to be a solution, a technical solution that is going to be put together by a third-party that we trust, that has a lot of expertise that you lack from the inside." And I would say for the future of our industry, we need a lot more of that collaboration.
(27:14):
That's one example, child sexual exploitation, but what about non-consensual nudity of adults, or hate speech, or ... That's more difficult because, while I think we can all agree about the legal status of content of that nature, it's much trickier and much more sticky when it comes to other speech issues. Like what is considered hate speech in Spain, where I grew up ... I grew up in the '80s and '90s with a lot of terrorism, and we had some pretty strict laws, we still do, around what you could say publicly, and that's very different from what I can say in the U.S. And so while it is difficult, I think sharing intelligence and signals, and working together, having TrustCon, the Integrity Institute, working with civil society, Insafe, INHOPE that I mentioned earlier, are really good examples of that. They gave us the chance to engage as an industry.
(28:17):
I met people from industry, sitting at conferences that were put together by the European Commission. And I think that's how we're going to get that 5% better, is to continue to think through, "How can we collaborate?," because we can't do it alone. And also, even if we tackle a challenge in our own platforms, we're never going to eradicate that abuse at the internet level.

John Starr (28:41):
True. We talked about some of your aha moments for your career or for your path, or your journey. In Trust & Safety, we all have our "Oh, no" moments. Was there an "Oh, no" moment for you over the years? And you don't have to get into specifics if you don't want to, but was there an "Oh, no" moment for you where you're like, "Oh, the game has changed," in just the kind of adversarial nature of things or the evolution of products?

Patricia Cartes (29:21):
Yes.

John Starr (29:21):
I remember being at Twitter, and learning that the company had just purchased a live stream app.

Patricia Cartes (29:28):
Yes.

John Starr (29:29):
And that was like an, "Oh, you know what?," moment, for me, just having to get up to speed really quickly on kind of what that would mean. Was there one of those for you?

Patricia Cartes (29:39):
Yeah, I've had a few. I've had a few.

John Starr (29:41):
I'll bet.

Patricia Cartes (29:42):
Yeah. I mean, how much time do you have?

John Starr (29:43):
Yeah. Yeah.

Patricia Cartes (29:45):
But one that I distinctively remember is, I think you were part of meetings with the Trust & Safety Council we had at Twitter.

John Starr (29:52):
Oh, yeah. Yeah.

Patricia Cartes (29:53):
And we decided we want to bring in civil society. At the end of the day, they're the experts on the different fields of abuse, and it would be great to keep them informed on any product feature we're developing, policy, et cetera, but also learn from them on a really regular basis about what they're seeing in their markets for that type of abuse. And I remember hearing about how one of our features, which was Block, was actually being misused by abusers in a domestic violence setting.

John Starr (30:27):
Yeah.

Patricia Cartes (30:28):
And that misuse, we had not anticipated.

John Starr (30:33):
Yeah.

Patricia Cartes (30:33):
And unless you were in the weeds, working with law enforcement, with victims of domestic violence, you probably wouldn't have been able to anticipate it. We didn't.

John Starr (30:43):
Yup.

Patricia Cartes (30:45):
And that was a real, "Oh, no." I mean, of course, before you put out a feature, you think through, "How can this be misused?," but there's going to be some edge cases that are too far out, like the mirrors in the car, you can't see them. And I thought if we had engaged the Trust & Safety Council earlier, perhaps we could have prevented that type of misuse. And there were victims already, and we were able to fix it really quickly, but there were already victims. And one victim is one too many, so that was a real moment of, "Okay, we need to do better." This is not just Twitter, or Facebook, or Google, this impacts anybody who's on those platforms, and it's our responsibility to really think about that adversarial red teaming, and where is the misuse of the platform going to come from?

John Starr (31:43):
Yeah. What's the future of Trust & Safety?

Patricia Cartes (31:47):
I'm hoping, TSPA, I was thinking about it this morning.

John Starr (31:51):
Yeah.

Patricia Cartes (31:53):
I remember when the TSPA became an idea, because Adelin Cai, who used to work with us at Twitter, visited me at Postmates, a company where I was working on offline harms, and she brought it up, and she had this idea.

John Starr (32:06):
Of course.

Patricia Cartes (32:07):
I thought it was ... I mean, it was just her and a group of very impressive people, and then I was like, "I'm behind it."

John Starr (32:13):
Sure. Yes.

Patricia Cartes (32:13):
"We're going to join you. Absolutely." And now, four years on, I cannot believe it was just four years. It feels like the TSPA has been around for a long time. The Integrity Institute has been around for a long time.
(32:25):
They haven't. And so I think for me, the future of Trust & Safety is a lot more of this. Our awareness that we are a field of professionals, I think we knew, but we now feel entitled to feel pride for the work that we do.

John Starr (32:43):
Yup.

Patricia Cartes (32:44):
We tend to be the people that are in the background. When I meet parents of the kids that go to school with my kids, and they ask me like, "What do you do?," I'm like, "I work in online safety." And it feels weird to explain what you do, and I'm hoping that through the TSPA, even like universities are developing incredible curricula for Trust & Safety-

John Starr (33:13):
Totally. You can go to college for Trust & Safety now.

Patricia Cartes (33:15):
Yes, yes. So going back to what Sinéad had said, it's like-

John Starr (33:16):
Yes, exactly. It's evolved.

Patricia Cartes (33:19):
Maybe my kids could see my work and say, "I want to be a Trust & Safety professional and go to Stanford," and I mean-

John Starr (33:25):
It's huge.

Patricia Cartes (33:25):
Yeah. Who knows?

John Starr (33:26):
It's huge.

Patricia Cartes (33:27):
Yeah. So I'm hoping that we continue going down this professionalization. I don't know if that's a word. I'll make that-

John Starr (33:35):
I think it is.

Patricia Cartes (33:37):
And just feeling that pride because I think we deserve it. We see difficult content. We try to protect users. I don't know anybody in this space that does it with any misguided motivation. Nobody's here because they want to make millions of dollars.
(33:59):
It's like you're in this field because you want to prevent harm, and you want the internet to achieve its potential promise to society and advance society, and so I hope that we get to do more of that as a group of professionals.

John Starr (34:17):
That is a great way to wrap up the conversation. I found it to be very compelling, and I learned a lot about you today, which is so cool. Are you excited? Are you speaking at TrustCon?

Patricia Cartes (34:31):
I'm speaking at a Birds of a Feather event about the DSA.

John Starr (34:35):
Ooh. Okay. On that note, thank you so much for taking time out of your week to join us. I really appreciate it.

Patricia Cartes (34:47):
Thank you for having me. As you know, I am a huge fan. I just said it, you've inspired so much of my career, and it's such an honor to get to talk to you and also be within the Thorn universe because you are ... Don't tell anybody, but you are my favorite nonprofit and experts in the field of child safety, so thank you. It's really an honor for me to get to talk to you.

John Starr (35:13):
Really appreciate it. Thank you.