Everything is politics but politics is not everything
Not all decisions tech companies make are rooted in politics
Created via ChatGPT with the prompt “image that depicts everything is politics but politics is not everything.” The tool described the image as: “The image depicting the concept "everything is politics but politics is not everything" is shown above. It illustrates a large, complex machine interwoven with various aspects of everyday life, contrasting with a serene landscape representing areas of life untouched by politics.”
Before I get started, there are a few quick things to note. First, I borrowed this headline from a 1986 book by H.M. Kuitert. I can’t find much about the book online other than that it came up in a search when I wanted to see if the phrase had history, but it encompasses what I want to say, so I hope folks don’t mind me borrowing it.
Second, this was not the first or even second topic I originally wanted to write this newsletter about. At first, I thought I would do it on the Supreme Court arguments on the Texas and Florida content moderation laws, but there are much smarter people than me you should be following on that, such as Daphne Keller and Kate Klonick. Tech Policy Press has had some amazing pieces as well.
Then I wanted to do it on this poll the Bipartisan Policy Center, Integrity Institute, and States United released about who voters trust for election information in 2024. It’s a report full of interesting facts, including:
Most Americans have confidence in the 2024 presidential election. They are more confident that votes in their community and state will be counted accurately than votes across the country.
When asked about election-related concerns, Americans point to misleading election information, violence after Election Day, and attempts to overturn election results.
Americans learn about elections primarily through television and social media.
Read the whole thing. I’ll maybe do more on it this weekend. However, I sidelined it after reading Ben Thompson’s latest on Gemini and Google’s Culture.
For those of you who might not have been following this story, last week, Google came under fire for its AI tool, Gemini, refusing to return images of white people while readily generating images of people of other races. The company took the image generator offline amidst the complaints. Some, like Ben, are attributing the tool acting this way to the political leanings of the company culture and the people who trained the model. On his Dithering podcast, he theorizes that no one at Google felt comfortable speaking up to say the model should be changed.
I’ve written extensively about tech companies’ approach to politics and how some try to avoid getting pulled into the political fray. What I haven’t touched on, and Ben does in his piece, is the reverse - when people attribute a tech company’s decision or action to politics.
First, I am not saying politics doesn’t play a role in company decision-making. In today’s day and age, the risk of being dragged into Congressional or Parliamentary hearings, in front of courts, and into the top headlines is absolutely taken into consideration when companies make some decisions. Just look at all the announcements companies made in the lead-up to the last hearing on child online safety so they’d have something to talk about.
However, while some might think the world revolves around politics … it does not. In my experience both at Facebook and in talking to my peers in the industry, when something happens like the Gemini case last week or a piece of content being removed, it’s usually because:
It was a bug - Somewhere, a classifier misidentified something, or a product just broke. No one purposely sought out content because it was left or right; it was just a mistake.
Teams were rushed - From talking to folks, this seems like the most likely answer on the Gemini stuff. Google is feeling the heat of being behind on AI and pushed the product out the door without giving the teams proper time to do all the red teaming and other testing needed to ensure something like this didn’t happen.
Incompetence - The people building the product didn’t even think to test against these types of queries, didn’t bring in people who could because those people would likely slow down the release - see point two - and didn’t know to ask these questions.
Prioritization - The company knew the product had these issues but chose to focus on issues it felt were more important and release it anyway.
I can’t tell you the number of times I’ve had people - usually on the right - think that there were people at the companies purposely seeking out their content to remove or suppress it. And then the number of folks on the left who think that the more right-leaning members of the company were purposely trying to help Republicans.
I just never saw it. People had their political leanings, but I can’t name a single conversation I’ve been in during my time at Facebook or outside of it with other companies where the question was how we help advance a particular political cause. I also have a hard time believing that no one at Google spoke up about the problem. I think either the people whose job it was to do that might not be there anymore, and/or they knew but still rushed the product out. That’s why Ben’s last paragraph concerns me:
“That means, first and foremost, excising the company of employees attracted to Google’s power and its potential to help them execute their political program and return decision-making to those who actually want to make a good product.”
I’ve known Ben for almost 25 years now - since our days at UW-Madison - and I have a hard time believing he’s advocating for a sort of McCarthyism within tech to do what he says above. But it sure sounds like it. This is a dangerous path to go down.
Now, people do have biases. At Facebook, Sheryl spearheaded a class on unconscious biases with the Learning and Development team. It helped people at least think through where they might be bringing a particular point of view to a problem and to consider other perspectives. That included ideological biases.
I can already feel some of you rolling your eyes at the concept of a class on unconscious bias. But, I’ll tell you this: I had a lot of product people come up to me over the years asking for my point of view on something because they realized they had a more liberal perspective and wanted to gut-check their work. Better than not being aware of it.
Unconsciously building something into a product is not the same as intending to push a political position. If anything, platforms are continuously on a fool’s errand to show that they aren’t being biased.
And, what good has that done them? They still keep getting yelled at.
I get the desire just to build good products. It’s a nice dream. It’s not reality. It comes back to what I’ve been saying:
Platforms can run, but they can’t hide from politics.
The reality is everything they do will be turned political in this environment. They are an easy foil for those across the spectrum. As a friend told me when we discussed this, “People will see a conspiracy when they want to see one.”
There will always be something - a difficult trade-off - that someone will be able to spin as being too woke or too conservative.
The inclination of these companies is to shut up and not explain their thinking. “If you’re explaining, you’re losing,” goes the old Ronald Reagan quote. However, as I mentioned the other week, if you want to change the world, people will notice, and you will have to show your work. You’ll also have to get better at distinguishing signal from political noise so you can keep doing good work.
Ben laments in his piece that he wants to avoid politics but can’t. He says people in tech need to write more articles like his. I agree; we do need more voices in this debate. Companies need more people pushing and asking the hard questions.
But the solution is not to try to eliminate politics. Instead, those inside of tech need to accept that you can’t remove the politics from anything you build. Those outside of tech need to accept that sometimes, politics is not at the center of everything tech builds.
Please support the curation and analysis I’m doing with this newsletter. As a paid subscriber, you make it possible for me to bring you in-depth analyses of the most pressing issues in tech and politics.