Presidential hopeful Elizabeth Warren. Drew Angerer / Getty Images

By now you've heard the story: Elizabeth Warren wants to break up Big Tech, so her presidential campaign bought ads on Facebook talking about how Facebook is too powerful and should be cut down to size. Then Facebook blocked Warren's ads, making Warren's point for her.

This episode, Warren says, is a perfect example of why we should have "a social media marketplace that isn't dominated by a single censor."

But it's also a perfect example of an additional issue with Facebook: the social media giant's ongoing reliance on "outsourced content moderation."

The phrase, which I first noticed in an October 2018 New York Times editorial, describes how Facebook not only relies on users to produce its content for free but all too often relies on users to police that content for free as well.

Facebook has lately hired thousands of moderators to address this issue (low-paid positions that come with their own problems and traumas), but it's clearly not enough.

In the Warren episode, as in so many others, it was a journalist who had to nudge Facebook toward the correct content moderation response.

After Politico broke the story of Facebook blocking Warren's ads, the company backtracked and restored them in the interest of what it called "robust debate."

This kind of thing has happened many times before, in all kinds of variations. Very often, it's journalists pointing out content that Facebook hasn't taken down but should.

"Social media misinformation is becoming a newsroom beat in and of itself," the Times editorial explained, "as journalists find themselves acting as unpaid content moderators for these platforms."