Trust & Safety Approaches an Inflection Point
Thoughts on how to ensure this work continues to thrive and adapt
These past two months have been a period of deep introspection for me. You may have noticed this tone in my recent writing.
This past long weekend was filled with it. I had a ton of fun with friends, and the relaxing pace allowed me to step away from my day-to-day life to think and reflect—both on myself and more broadly.
Going into the weekend, I had many conversations with folks about a leadership change at the Integrity Institute, which expanded into broader discussions about where the trust and safety community is headed. That dovetailed into thinking about where I could have the most impact as this field reaches an inflection point.
While the term trust and safety has been around since the early 2000s, there was an inflection point after 2016 when a whole industry of researchers, non-profits, journalists, and tech employees was born to make sense of and mitigate the online harms that were now staring us in the face. In 2018, the Trust and Safety Professionals Association was born, and in 2021, the Digital Trust and Safety Partnership and the Integrity Institute were founded. With the start of TrustCon and the Trust and Safety Research Conference in the Fall of 2022, the community started to professionalize even more.
Little did we know at those September gatherings that a month later, Elon would kick off a string of layoffs across tech - including many of these workers. The organizations I mentioned above quickly filled a real void, giving the community a place to gather that their employers no longer provided. TrustCon 2023 was the first time many former Twitter employees saw one another in person after being let go. You could feel the collective healing they needed to go through - something only they understood. It felt almost intrusive to watch.
The Facebook civic integrity community has experienced a similar trauma, which it has continued to work through since disbanding after the 2020 election. People and groups from other communities have also experienced their own traumas.
If there’s a silver lining to the layoffs, it is the number of people who can now share their wisdom more broadly, rather than only with the next tech company to hire them. We saw people write valuable guides on elections, transparency, leadership, and policymaking vis-à-vis trust and safety. They advised policymakers and their staff. They helped make civil society recommendations to companies more actionable. They shared their thoughts in LinkedIn posts and started newsletters and podcasts.
Things are starting to shift, though.
People are getting full-time jobs. They’re heads down doing the work, whether on AI, elections, child safety, fraud - you name it. They lost steam. They’re afraid of being attacked as the work gets more politicized. They’re burnt out.
At the same time, people who are very good at thinking about tough tradeoffs in the abstract struggle to talk among themselves about hot-button topics, from Ukraine to abortion to Israel/Palestine to elections. The work gets harder when you are personally affected by it.
Funders aren’t giving as much. People are becoming consultants or creating their own trust and safety products. Political figures are politicizing the work, causing a chilling effect. Companies are shifting their focus. AI is upending everything.
New generations are starting to join in on the work—founders and platforms who don’t have the scars from 2016. Some think they can avoid the same problems, but many quickly realize they can’t. They need trust and safety help but don’t have the resources to build any of it in-house. They might not even be able to have a full-time trust and safety lead. Moreover, the problems manifest differently, requiring a new approach and innovative thinking.
We are at a new inflection point. One that requires short and long-term thinking to ensure that this work can continue to thrive and adapt. Some thoughts on how we might do that:
Short-term, we need to focus on not dividing ourselves and on pushing back against being politicized. Things will only get trickier as the rest of the year progresses. How do we want to have the more difficult conversations as a community? I will dust off my copy of Crucial Conversations and try to be self-aware enough to know when something bothers me. We also need to stand up for this work. We cannot hide and hope that those who think our work is unnecessary will disappear. We must push back and raise awareness of the important work trust and safety professionals do every day. And we must also be honest about where we make mistakes.
Bonus point here: We better be ready to rebrand ourselves if trust and safety as a term becomes too politicized coming out of 2024.
Our new horizon needs to be 2029. Can you tell what my new thing is now that the 2024 elections are here and nearly half of them are already behind us? As I wrote last week, we are in for a lot of change and need a beacon to point toward. By 2029, we should have built the updated support systems and skill sets for the profession and be executing on an entirely different plane.
Newer platforms need vendor and consultant solutions to focus on trust and safety earlier. First, a plug for the Duco Trust and Safety Market Research Report that we put out in March. In it, we describe a market five years from now that has grown significantly, driven by platforms' need to comply with regulations and by public pressure to be safe. I also think that just as you see fractional leaders in marketing, finance, and other functions, we need to start building a fractional trust and safety ecosystem: people who aren't full-time at a company, but whom leadership can turn to for help solving immediate issues. This, plus vendor and expert solutions, will become increasingly important. (I know - this is all a bit of a shameless plug - but I also really believe it based on conversations I've been having with folks.)
Bring VPs and above into the conversation. I love that so much of the work outside the tech companies these last few years has come from the grassroots - those on the front lines. However, these people will start to move into leadership roles, and we need to equip them with the skills they'll need at those levels. We also have people already in leadership who play a huge role in deciding not just the funding and resources for these teams but also the policies themselves. They weigh different tradeoffs - impossible ones, you might say - from the ones those on the front lines face. Play Trust and Safety Tycoon if you don't believe me. We must bring them into the conversation and discuss those tradeoffs and decision points.
We have strength in our shared community. I've been lucky enough to be part of a few strong communities during my career. In the aughts, I became friends - and remain friends - with many digital political operatives who worked for Democrats. Part of that is that while we may have had different views on certain issues, we had a common goal: get more politicians using social media. At Facebook, you had people from across the political aisle, but we shared the goal of helping to connect the world. Post-Facebook, I've been honored to be part of a community of technologists, policymakers, civil society members, journalists, academics, and others who care deeply about these issues and want to get them right. Our strength is our ability to have hard conversations, step up when needed, and do the work no one else wants to do. Let's not forget that when others want to tear us apart.
I got a little soap boxy there at the end. Forgive me, but I will keep it in as it flowed out of me, meaning it needed to be said. And I just realized I didn’t even get into how AI will change this field. That’s another newsletter for another day, but needless to say, I’m not going anywhere, and I look forward to figuring all of this out with everyone who wants to figure it out, too.
Please support the curation and analysis I’m doing with this newsletter. As a paid subscriber, you make it possible for me to bring you in-depth analyses of the most pressing issues in tech and politics.