Hello from Interstate 94. My dad and I are headed west to North Dakota to do some duck hunting after spending some time home in Wisconsin. When I planned this, I didn’t realize it would be one of the biggest weeks for Facebook in its history. I want to thank my family for understanding and my brother and sister-in-law, who let me take over their dining room as a makeshift studio Tuesday.
Kenny Chesney has a song called “Knowing You” on his latest album. At the beginning of the song’s music video, he does a voice-over that says, “Not everything is meant to last. You don’t think about that when it’s perfect. And when it’s over, there’s a huge hole, but there’s also a pair of wings.”
This sentiment is how I feel about the team I built at Facebook and my colleagues on the Civic Integrity Team. We weren’t perfect - far from it - but what we had was special. We were a group of brilliant, dedicated people bound by a mission to improve the internet’s impact on civic discourse and elections. And we didn’t realize just how special our group was until Facebook reorganized these teams. Frances Haugen - the Facebook whistleblower and former Civic Integrity team member - said it well in her testimony: it felt like a “dissolving of community.”
The pair of wings in this story is that we now have many integrity professionals - some still at Facebook, some at other companies, and some working in different areas around tech and democracy - who understand the complexities of this work and are helping the public, Members of Congress, the media, and many others understand all of this better.
The testimony this week is a milestone in this public debate. It was the most substantive hearing I’ve seen on this topic. No yelling, no forced yes or no answers, but rather actual time for explanation and discussion. If only the hearings with tech execs could be the same way. That would require Members of Congress to give up a chance for a potential sound bite and the tech executives to provide some actual insight and substance to their answers.
I’ve heard mixed reactions to the fact that Frances took these documents and released them. On the one hand, taking and leaking company documents can be against the law. It erodes trust inside the company, exposes the people who created the docs without their permission, and the documents can often be taken out of context. On the other hand, having this information out in the open seems to be elevating the debate. Many folks are in between: they don’t like the leaking of documents but are glad the information is out there.
I’m in that in-between camp. We need to be talking more about the tradeoffs, priorities, and decisions that Facebook and all tech companies make. We need to be realistic about the hard choices and not just make simplistic declarations that they should spend more money or that Facebook should stop putting profits over safety. While both of those things are true, it’s not that simple. Hiring enough people with the right expertise for these emerging issues is challenging because the pool is small. I don’t recall ever being in a meeting where how much money Facebook made was a point of discussion (that doesn’t mean it didn’t happen in other meetings, just not the ones I was in), but many times the tradeoffs we were debating were about balancing free speech and harm.
The documents we’ve seen present only part of the picture of decision-making at a place like Facebook. I hope the public can get a more complete view of how companies make decisions and set priorities. We should hear more from people who can share their expertise. I hope that I can help play a role in this too.
Frances said in her testimony not to believe the false choices Facebook gives. I would argue we shouldn’t believe the false choices anyone in this debate provides. The answers to these problems are not black and white - and for a tech world that likes to work in ones and zeros and a political world that works in reds and blues, that’s hard to swallow. Instead, we need to be looking at nuanced solutions so we can find the right balance. Determining where to draw those lines is where the honest debate lies.
This brings me to a point I wanted to make about algorithms. They were a large portion of the debate yesterday, and the discussion is an interesting one around choice. When thinking about what content people see in their newsfeed, you need to think about four things.
First are the choices someone makes when they first sign up for the service. This is where they send friend requests, like pages, and join groups.
Immediately upon signing up and making a few of these choices, an algorithm then comes in to make recommendations to you. This is Facebook or any other platform making choices of what to show you based on the other things you like and what is popular (or whatever else the platform wants to show you).
Then another algorithm comes in to sort the content from the things you’ve chosen to follow because there’s usually way too much for anyone to consume in a day.
Finally, another algorithm decides which ads to show you based on the advertisers wanting to reach you.
Letting people alone make all the choices can reinforce filter bubbles (to the extent they actually exist online). And we’ve certainly heard a lot about how just letting the algorithms be in charge is problematic. I like the proposals Daphne Keller has written about, where people get to choose how they want their content filtered, and then we likely still need a surface people can go to when they want content picked for them.
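To make those four steps concrete, here’s a minimal, purely hypothetical sketch of how such a pipeline might fit together in code. None of these names, signals, or scoring rules come from Facebook’s actual systems; the sketch just separates the explicit-choice step, a recommendation step, a ranking step with a user-selectable mode (in the spirit of the proposals Daphne Keller describes), and an ad-selection step.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    engagement: float  # hypothetical popularity signal (likes, comments, shares)
    timestamp: int     # higher = newer

@dataclass
class User:
    follows: set       # step 1: explicit choices - friends, pages, groups
    interests: set     # declared or inferred topics
    ranking_mode: str = "engagement"  # a user-selectable filter (Keller-style choice)

def recommend(user: User, posts: list, limit: int = 2) -> list:
    """Step 2: the platform suggests content from accounts the user doesn't follow."""
    candidates = [p for p in posts if p.author not in user.follows]
    candidates.sort(key=lambda p: (p.topic in user.interests, p.engagement), reverse=True)
    return candidates[:limit]

def rank_followed(user: User, posts: list) -> list:
    """Step 3: sort the (usually overwhelming) pool of followed content."""
    followed = [p for p in posts if p.author in user.follows]
    if user.ranking_mode == "chronological":
        return sorted(followed, key=lambda p: p.timestamp, reverse=True)
    return sorted(followed, key=lambda p: p.engagement, reverse=True)

def pick_ads(user: User, ads: list, limit: int = 1) -> list:
    """Step 4: choose ads whose targeting matches the user's interests."""
    return [a for a in ads if a["target_topic"] in user.interests][:limit]

def build_feed(user: User, posts: list, ads: list) -> list:
    return rank_followed(user, posts) + recommend(user, posts) + pick_ads(user, ads)
```

Even a toy version like this shows where the levers are: which signals feed the sort, and who gets to pick the ranking mode.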
Finally, a question I’m getting a lot in all of this is how Congress should regulate Facebook. Politico Nightly asked me this same question, and I wanted to share my answer here. These are only top-line ideas.
The government should look at regulating Facebook and social media in five areas:
Transparency: Congress could compel companies to make more data available to researchers and be more transparent about their decision-making processes. Nate Persily from Stanford has released a good proposal for this.
Privacy: We need privacy legislation that puts guardrails around how people's data can be used and exposed - including for research.
Definitions: Congress needs to help Facebook define what content constitutes things like a political or issue ad and what - if any - restrictions political figures and organizations should have online. For instance, should campaigns be required to be transparent about what types of data and targeting they use outside of platforms for their advertising?
Competition: Congress needs to put rules in place that increase competition, such as requiring data portability so that people can move their network of friends to new platforms.
Accountability: We shouldn’t allow CEOs like Mark Zuckerberg to have no accountability for their decisions. Right now, he controls a majority of the voting power, and there is no recourse when people think he’s made the wrong decisions.
Sorry for such a long newsletter this week. It’s been a lot, and I didn’t even touch on the outage. As I said to Politico, I think we’re only at halftime right now. Presumably, there are more Wall Street Journal stories to drop based on the documents, there’s the question of whether these docs will be made public, and there’s the question of whether Congress will call on Mark or other execs to testify.
So buckle up. This Fall rollercoaster is just getting started.
What I’m Reading
There’s no way I was going to capture everything I’ve been reading this week. But here’s a subset.
Stratechery: Ben Thompson has yet another excellent piece looking at Facebook’s political problems
Carter Center: One of my projects post-Facebook has been working with the Carter Center on this paper with ideas on what to do about repeat offenders online.
Washington Post: Nate Persily has some ideas on how to get platform data into researchers’ hands - Facebook hides data showing it harms users. Outside scholars need access.
Al Jazeera: Singapore parliament to debate 'foreign interference' law
Jobs
Democracy Works is seeking a strategic, committed leader to serve as its Chief Executive Officer. Please visit https://www.on-ramps.com/jobs/2349 for more information and apply!