… well, no real need to finish the sentence, is there.
Dear Mark, you're wrong. Met someone Monday who "learned on Facebook" Pope had endorsed Trump and that was meaningful. Own it. Help fix it. https://t.co/8qJqdKSuji
— Jason Kint (@jason_kint) November 11, 2016
Mark Zuckerberg: To think fake news on Facebook influenced the election in any way is a pretty crazy idea https://t.co/05KI2bUZkS @USATODAY
— Jessica Guynn (@jguynn) November 11, 2016
It’s understandable why Facebook would want to deny wielding influence. And why it would deny being a source of news, or anything that implies that it bears any responsibility for what people believe as a result of the content it delivers.
If you’re a source of news, then you assume a whole raft of ethical obligations: ensuring that what you say is credible, avoiding conflicts of interest, making sure your marquee columnists aren’t plagiarists … obligations too basic and too numerous to set out here. It would be easier to just shrug your shoulders, throw up your hands, and claim, over and over again, that you’re not responsible for how people react to what you put out there.
But it would be wrong. In fact, it is wrong.
We’ve already discussed Faceborg’s problem with fake news, and how its algorithms don’t filter worthwhile and credible content from manufactured crap. What happened yesterday? Not even the tip of the iceberg.
It’s a real problem, and it’s not going away. The sooner Facebook recognizes that, the better.
Update: As usual, @mathewi is way ahead.