Benedict Evans, a partner at the well-regarded venture capital firm Andreessen Horowitz, recently argued that Facebook’s users shoulder much of the blame for the company’s recent troubles. In his view, Facebook must provide the content its users demand to stay on top in a competitive market.
For those new to this topic, there has been a continuing drumbeat of stories showing the negative consequences of Facebook, from the dissemination of fake news to camouflaged political advertising by the Russians. In the last few days, reports have shown how Facebook allowed politicians and extremists in Myanmar to rally public opinion against the Rohingya Muslims, encouraging hatred and the killing of innocent civilians.
Evans’s central point is that user choice is king. To illustrate, he compares Facebook to the fashion house Dior in the mid-20th century, when it broke from wartime restraint to dress the postwar woman in sumptuous fabric that would have been impossible during wartime rationing. Christian Dior succeeded because he captured the postwar mood, but he didn’t manufacture that mood. Facebook, Evans’s argument goes, must offer up what its users demand or be relegated to the graveyard of tech companies.
As a partner in venture capital, Ben Evans has a degree of insight that few others have, but his critique suffers from some basic flaws.
First, he misses how much power Facebook has today. Facebook isn’t an upstart fashion company in the 1940s; it’s the most powerful news distributor in human history. As a network-effect business at such scale, Facebook is difficult to compete against. Technology moves fast, but between the Facebook Feed, WhatsApp, Messenger, and Instagram, it’s likely Facebook will be used by billions around the world for years to come — with little user choice among social networks. The values of Facebook’s founder and employees are now the de facto standard for billions, with little societal input.
In the recent Myanmar case, Facebook worked with a local carrier, giving itself an advantage over competing social networks. For many citizens of Myanmar, the internet is Facebook and Facebook is the internet.
As a venture capitalist, Evans looks for businesses with these characteristics. They’re highly profitable and long lived precisely because they’re not forced into costly competition — with winner-take-all results. In that context, user choice is hardly king and, critically, businesses can look beyond short-term engagement when making decisions. This is especially true once companies become so large and dominant. (I also wonder what his fellow partner Marc Andreessen would say given how much distribution muscle Microsoft used against Netscape in the 1990s, and how it negatively impacted user choice.)
Second, Evans misses the debates that American journalism has had over the last 150 years about balancing what users want against what they need. News has always played a delicate dance between engaging and informing readers. Today’s tech leaders are independently (and, to our detriment, slowly) re-learning the lessons that led to the current norms in journalism.
Substantial evidence suggests that the Spanish-American War was encouraged by the sensationalism of news organizations of the day and the growth of yellow journalism. News organizations found they could make up facts, leading to increased circulation — and war. In the decades after, journalistic ethics were established. All the modern journalistic principles we take for granted — fact checking, sourcing, objectivity, fairness — derive from this. For decades, much of American journalism has traded off profit to subsidize objective news, even though it hurt engagement.
[Image: Pulitzer vs. Hearst in the age of yellow journalism]
Fake news shows there is a market demand for content that supports our beliefs (“confirmation bias”). Under Evans’s thesis, companies that don’t allow the spread of fake news may be hurt in the marketplace; after all, based on their click behavior, users value fake news and may go elsewhere if they can’t find it. Using his argument, we should also question why The Wall Street Journal, The New York Times, NBC, or Fox News engage in fact checking, as it sets them up for disruption and costs profits.
I have many friends at Facebook, and know that Facebook’s product designers aren’t malicious people. But they need to understand their awesome power to influence what users believe and how journalists behave. Even though they don’t see themselves as editors, the algorithms they design are de facto curators of what one-third of the world sees. A human news editor may weigh engagement as well as accuracy and objectivity — while Facebook’s algorithmic editor will chase engagement at any cost. An algorithm designed to share friends’ photos should be augmented before being applied to information we use to vote or make other important life decisions.
These algorithms also set incentives for every journalist in the world, dictating the content they must create to remain profitable in this era. Is it so surprising that our debates have become so shrill and one-sided, when we solely chase engagement?
More concretely, Facebook’s choice to surface content shared by like-minded friends leads to a deep bias — and a highly selective view of the world. As I’ve written previously, I find this even worse than fake news, as “selective facts” are pervasive and we have a harder time questioning facts, no matter how cherry-picked. It’s not too extreme to say Facebook is encouraging a new form of segregation, in which we join groups that choose their preferred facts.
Product designers should take this as a sign about how important it is to get your job right. Given the power you have, you can directly influence how users think — and be part of a movement encouraging objectivity, grounded views, and fact checking, even if it costs engagement or takes manpower.
It may seem to Evans that Facebook is misguidedly addressing some of these issues, but it is not acting just out of the goodness of its heart. Advertisers, the engine of Facebook’s business, have been demanding more. Employee morale (especially critical in tech) has taken a hit, and Mark Zuckerberg’s own personal reputation has been diminished. There are wide-ranging calls for governmental regulation. As such, it’s increasingly economically rational for Facebook to begin addressing these issues.
No matter Evans’s view, selling dresses and distributing news are very different businesses, just as selling prescription drugs is different from selling candy — even though both are consumable products. (More concretely, economists and business school professors would cite factors like externalities, public goods, rational ignorance, non-rivalrous goods, and experience goods to highlight the differences.)
Finally, Evans’s argument neglects to tackle a broader point. Few of us are ready for an era where we can choose the news we want. Regardless of Facebook’s role, we need to give users the tools they need to critically examine what they read. Even if we solve fake news, the “selective facts” we see shared in our bubbles of like-minded friends are here to stay. This is especially difficult as we transition from an era in the US where we expected news to be accurate and more objective than it is today.
Our society needs antibodies to confront this world of total media choice and well-designed algorithms. Some courageous voices, especially among America’s teachers, have taken the lead in training the leaders of tomorrow in critical thinking for this new era of news. But there is more to be done.
Crucially, if we believe Evans’s central point, then there is a silver lining: if users clamor for change, Facebook will have to oblige or lose in the marketplace.
Given that, Evans and I can both agree on one thing: it’s up to users (as well as advertisers and Facebook employees) to demand more of the world’s largest news distributor.
Thanks to Camille Ricketts and Kim-Mai Cutler for feedback on this post. All views are solely my own.