Zuckerberg Needs to Acknowledge: Facebook is a Publisher
Zuckerberg and John Kerry (U.S. Government public domain image)
It distributes tens of millions of pieces of written content every day. Hundreds of millions of photos. Based on engagement, billions of readers consume that content daily. But is it actually a publisher?
Well, if it's Facebook, who knows? Kind of, I guess.
That's the fuzzy conclusion Mark Zuckerberg presumably wants the public to take away from a recorded "one-on-one" with Sheryl Sandberg.
> And, you know, Facebook is a new kind of platform. It's not a traditional technology company. It's not a traditional media company. You know, we build technology and we feel responsible for how it's used. We don't write the news that people read on the platform. But at the same time we also know that we do a lot more than just distribute news, and we're an important part of the public discourse.
Wait. "We don't write the news" people read? Guess what? The New York Times itself doesn't write news. Nor does the BBC. News is written, at those outlets, by a mix of staff writers and freelance contributors. The significant difference between those news outlets and Facebook is not that Facebook's contributors (its members) aren't paid for their contributions—not directly anyway—but that the contributions aren't edited.
That's actually quite a big problem, given the volume of news Facebook publishes and its extraordinary reach. In fact, it's the Wikipedia problem, but worse.
The Wikipedia problem? It's the encyclopedia, remember, that "anyone can edit." With the exception of a few high-profile articles, kept under lockdown by Wikipedia's unpaid administrators, anyone with an internet connection can add, remove, or change Wikipedia's content at will. Unsurprisingly, this means that some of Wikipedia's millions of articles are badly researched, badly written, and contain false information (because there are so many articles, and most of them are good enough, casual readers might not notice).
But what saves Wikipedia from complete chaos is that users can edit other users' work; sensible people can act as checks on complete nonsense. At Facebook, there isn't even a system of unpaid, crowd-sourced editing (users can comment on updates, but can only edit or delete their own).
The recent "fake news" debacle makes it even more depressing to see Zuckerberg hide behind weasel phrases like "not a traditional media company." At the end of the day, whether news is printed, published online, distributed by carrier pigeon, or written in the skies by an airplane, it's being published. And there are good reasons to think that what's published, especially to a large audience, ought in some way to be edited—and of course I don't just mean edited for grammar.
Facebook is, of course, taking some steps to address the problem:
- Allowing users to report fake news (a tiny step towards crowdsourced editing)
- Partnering with fact-checkers, and
- Restricting profits fake news purveyors can make from advertising (yes, contributors can be indirectly paid for publishing on Facebook).
The concern raised, even about these light-handed measures, is censorship. But this is just shoddy thinking. Editing something for truth, accuracy, and clarity is not censorship, and shouldn't be conflated with it. There's nothing to stop people from publishing badly researched, poorly written, or just plain false news stories. You can start your own website for a few dollars. Publishers, however, are entitled to decline to publish unedited, misleading content.
And dress it up how you will, Facebook is a publisher. A big one. If it doesn't start to view itself as such, it's inviting a sharp decline into the kind of hostile and untrustworthy environment where individual users—and brands—will no longer feel at home.