Discover more from Anchor Change with Katie Harbath
Future of Content Moderation
What this week’s debate is missing
In 2019 I was asked to give an Ignite talk - a format where speakers get 5 minutes and 20 slides that auto-advance every 15 seconds - and I chose to talk about free expression and content moderation online.
My message was twofold:
Be careful about regulating speech because those rules could be turned against you
Content moderation is hard
During this speech, I walked people through a little bit of the history of free expression - including the fact that Voltaire never said, “I disapprove of what you say, but I will defend to the death your right to say it.” I also walked people through some examples of hard content calls and asked them, in 15 seconds, to decide whether they would leave each piece up or take it down. These were based on a series the Guardian ran in 2017 called the Facebook Files (apparently a popular moniker) about hundreds of leaked documents on how the company does content moderation.
As part of that series, the Guardian created a quiz that lets people try being a content moderator. I encourage you to take it. I remember getting a few wrong the first time I took it, even though I thought I knew our content policies pretty well at the time.
Three LONG years later, this debate rages on. This week it was the news that Elon Musk is buying Twitter, along with his never-ending trolling tweets about how he would restore the platform as a place for free expression.
Plenty has been written about Musk. I’ve talked about it enough this week and even took the bait once, responding to his tweets when I got really upset at how he went after Twitter executive Vijaya Gadde. I finally had to stop looking at what he was putting out because it just kept making me angry at how he was conducting himself. I love how Charlie Warzel put it at the end of his newsletter:
“I watch my own impulse to engage with the richest man in the world online. I’m thinking today about what’s gained and lost in that process—a legion of well-meaning tweeters bringing stats to a meme fight. I feel my own capacity to attend to the world is somewhat wrecked these days. Maybe you feel it, too.”
So, instead of going point by point through all the ways I’m nervous that Musk doesn’t know what he’s getting into, I want to focus on two paragraphs from Mark Zuckerberg during Meta’s earnings call this week.
Now, I know it must be nice that for once Meta/Facebook is not the center of controversy. Part of me wondered this week if maybe this is finally the moment that the target moves off of Mark’s back and Musk becomes the new content moderation lightning rod for a while.
Regardless, these two paragraphs really matter: they signal an important shift in how Facebook thinks about its newsfeed. Here’s what Mark said (emphasis mine):
The second point is that while we're experiencing an increase in short-form video, we're also seeing a major shift in feeds from being almost exclusively curated by your social graph or follow graph to now having more of your feed recommended by AI, even if the content wasn't posted by a friend or someone you follow. Social content from friends and people and businesses you follow will continue being a lot of the most valuable, engaging and differentiated content for our services, but now also being able to accurately recommend content from the whole universe that you don't follow directly unlocks a large amount of interesting and useful videos and posts that you might have otherwise missed.
Overall, I think about the AI we're building not just as a recommendation system for short-form video, but as a Discovery Engine that can show you all of the most interesting content that people have shared across our systems. In Facebook, that includes not just video but also text post, links, group posts, re-shares and more. In Instagram, that includes photos as well as video. In the future, I think that people will increasingly turn to AI-based Discovery Engines to entertain them, teach them things, and connect them with people who shared their interests. And I believe our investments in AI, all the different types of content we support, and our work to build the best platforms for creators to make a living will increasingly set our services apart from the rest of the industry and drive our success. We're also finding that having an ambitious vision around building the world's Discovery Engine is attracting a lot of the most talented AI folks to work on this program.
What does all this mean? As Ben Thompson wrote this week, it means the company is “abandoning the social graph as the core organizing feature of the Family of Apps experience.”
This might not seem like a big deal but it is - especially when you take into account all the conversations about algorithmic transparency.
In practice, Mark is moving the Facebook newsfeed to be more like TikTok’s. Your feed will increasingly be filled not with content you’ve chosen to see, but with what the algorithm thinks you want to see based on other content you’ve watched and engaged with.
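To make the shift concrete, here is a minimal, purely illustrative sketch (this is not Meta’s actual code; the function names and post fields are my own invention) of the difference between a follow-graph feed and an engagement-prediction “discovery” feed:

```python
def follow_graph_feed(posts, followed_accounts):
    """Classic feed: show only posts from accounts you chose to follow,
    newest first. The social graph decides what is eligible."""
    eligible = [p for p in posts if p["author"] in followed_accounts]
    return sorted(eligible, key=lambda p: p["posted_at"], reverse=True)


def discovery_feed(posts, predicted_engagement):
    """Recommendation-style feed: rank ANY post by how likely a model
    thinks you are to engage with it, regardless of who posted it."""
    return sorted(posts, key=predicted_engagement, reverse=True)


posts = [
    {"author": "friend", "posted_at": 2, "topic": "news"},
    {"author": "stranger", "posted_at": 1, "topic": "video"},
]

# The follow-graph feed never surfaces the stranger's post at all;
# the discovery feed can rank it first if the model predicts engagement.
print(follow_graph_feed(posts, {"friend"}))
print(discovery_feed(posts, lambda p: 1.0 if p["topic"] == "video" else 0.1))
```

The policy point is in that last line: in the second model, the ranking function - not the user’s follow choices - decides what reaches people, which is why transparency about how it works matters so much.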
We’ve seen and heard plenty about how this can send people down problematic rabbit holes, though new research suggests that recommendation engines may not do this on YouTube.
Regardless, this makes understanding how the platforms are designed and how their ranking algorithms work all the more important. If you want to learn more, my colleagues at the Integrity Institute have some good resources.
I think this will also shift how we need to think about content moderation and free expression. If people are going to platforms more to be entertained than to make their own content, then we’re moving more into conversations about deplatforming and Spotify’s decision to pay Joe Rogan for content. (I realize the stories I link to note that deplatforming hasn’t necessarily worked and that Spotify hasn’t lost subscribers over Rogan, which I think is important to remember. Jared Holt’s piece in the Daily Beast about Alex Jones is also worth reading.)
This means we’re moving into a world where platforms are making more content decisions, not fewer, in the form of more curation of what you see. That content will come from creators rather than regular people. Instagram’s Adam Mosseri focused on this in his recent TED talk, where he said, “Over the next ten years, I think we will see a dramatic shift of power away from platforms, like the one my team and I are responsible for, and towards a group of people I am going to describe as creators.”
I’m not sure that’s entirely true when the platforms and their recommendation algorithms, like those Mark described, hold so much control over how many people see that content. Creators already get frustrated not knowing whether the algorithm will love them one day or hate them the next.
What these creators can and can’t say will remain a topic of debate, but so will how the platforms choose what to boost or not. And what does this mean for everyday people - the ones the public square is supposed to serve, and for whom Musk supposedly wants to build?
Perhaps this will move the content moderation conversation beyond the rules platforms have for content to the principles and transparency behind how they boost that content. Knowing that is just as important as knowing how they draw the line between what is and isn’t allowed in your feed.
PS: Another story that got buried this week: Europe reached agreement on some key provisions of the Digital Services Act. This is going to have a huge impact on platforms as well. For a good CliffsNotes version of what’s in it, check out this piece by Daphne Keller.
May 1 @ 11am Eastern: Friends Kristen Soltis Anderson and Moira Whelan on Reliable Sources. Moira will be talking about disinformation and Ukraine.
May 4 @ 2pm Eastern: Brandon Silverman, Nate Persily, Daphne Keller, Jonathan Haidt, and Jim Harper testifying at the Platform Transparency: Understanding the Impact of Social Media subcommittee hearing
The Atlantic: Running Twitter Is Going to Disappoint Elon Musk
Tech Policy Press: International Coalition Launches Declaration for the Future of the Internet
Africa Center: Mapping Disinformation in Africa
Ranking Digital Rights: The 2022 RDR Big Tech Scorecard
New Zealand: Digital Violent Extremism Transparency Report
Deb Liu: How To Get Promoted
The Center for Policing Equity: Vice President of Technology and Data Strategy
NEW: May 20 - All Tech is Human: Responsible Tech Summit: Improving Digital Spaces
Topics to keep an eye on that have a general timeframe of the first half of the year:
EU Passage of DSA and DMA
Facebook 2020 election research
Oversight Board opinion on cross-check
Senate & House hearings, markups, and potential votes
May 3 - Ohio Primary (Open Senate race)
May 9 - Philippines elections
May 17 - North Carolina and Pennsylvania Primaries (Open Senate races)
May 20 - All Tech is Human: Responsible Tech Summit: Improving Digital Spaces
May 21 (On or before) - Australia elections
May 23 (tentative): World Economic Forum, Davos
May 24 - Alabama and Georgia Primaries (AL open Senate race, GA Warnock defending seat)
May 29 - Colombia elections
June 6 (week of): Summit of the Americas, Los Angeles, CA
June 6-10: RightsCon, Online
June 6 - 7: Atlantic Council 360/Open Summit
June 9 - 10: Copenhagen Democracy Summit, Copenhagen, Denmark
June 14 - Nevada Primary (Cortez Masto defending Senate seat)
June 25 - July 1: Aspen Ideas Festival, Aspen, Colorado
August: Angola elections
August 2 - Arizona and Missouri Primaries (AZ Kelly defending Senate seat, MO open Senate race)
August 9 - Wisconsin Primary (Ron Johnson defending Senate seat)
August 9 - Kenya elections
September 11 - Sweden elections
September 13 - New Hampshire Primary (Hassan defending Senate seat)
September 28 - 30: Athens Democracy Forum
October 2 and 30 - Brazil elections
November 8 - United States Midterms