Deconstructing Facebook and Google: why are we seeing what they’re showing?

Two seemingly disparate stories involving two of the most dominant content distributors on the planet.

In one, our friend Mathew Ingram discusses Mark Zuckerberg’s video initiative, which, he says, undercuts the Faceborg supremo’s insistence that he’s not running a media company. In a subsequent post, he highlights Faceborg’s ostensible commitment to fighting fake news.

It’s not hard to see why Zuckerberg resisted the characterization as long and as hard as he did. We’ve talked about that previously ourselves. And as Ingram points out:

Facebook likes things that are neat and tidy, like algorithms—not things that are all muddy and gray and complicated, like defining what constitutes fake news.

Well, we’ve all seen how effective the algorithms are at distinguishing genuine, authentic content from bullshit. And we’ve already talked about how those algorithms are shaped by your ultimate goal: do you want engagement, or do you want veracity? Do you want to be clicky, or do you want to be authentic? Can’t always have both. And which one you prioritize is going to determine what floats to the top of your menu.
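The tradeoff can be sketched in a toy ranking function (purely illustrative: the items, weights, and scoring here are invented, not any platform's actual algorithm):

```python
def rank(items, w_engagement, w_veracity):
    """Score each item by a weighted mix of engagement and veracity,
    then return the items sorted best-first."""
    return sorted(
        items,
        key=lambda it: w_engagement * it["clicks"] + w_veracity * it["credibility"],
        reverse=True,
    )

# Hypothetical stories with made-up numbers.
stories = [
    {"title": "Sober fact-check", "clicks": 120, "credibility": 0.95},
    {"title": "Outrage bait",     "clicks": 900, "credibility": 0.10},
]

# Prioritize engagement: the bait floats to the top.
print(rank(stories, w_engagement=1.0, w_veracity=0.0)[0]["title"])  # Outrage bait
# Prioritize veracity: the fact-check floats to the top.
print(rank(stories, w_engagement=0.0, w_veracity=1.0)[0]["title"])  # Sober fact-check
```

Same content, same code, opposite front pages; the only thing that changed is which goal got the weight.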

There’s no great insight in observing that this is going to get a lot messier before it gets any neater. The accusations of bias, censorship, lack of transparency, and hidden agendas are going to be deafening, and they’re going to be coming from all sides. The language is going to be heated and ugly. If there’s any small comfort to be drawn from this, and it’s a big “if,” it’ll be in Facebook’s acceptance of responsibility for the content it serves up.

(In any event, it might all be academic anyway. As our friend Jonathan Albright argues, fake news is soon to become the least of our problems.)

Google ranks Holocaust denial #1

The second story is a disturbing piece in the Guardian by Carole Cadwalladr. When she tried a Google search involving the Holocaust, the search bar auto-completed her query to read “did the Holocaust happen?”

And there, at the top of the list, was a link to Stormfront, a neo-Nazi white supremacist website, and an article entitled “Top 10 reasons why the Holocaust didn’t happen”.

She then recounts Google’s insistence that it would not rewrite its search algorithm* or remove the results, despite its declaration that it did not endorse those views. Eventually Cadwalladr did an end run around the organic search results by buying a paid Google ad that bumped Wikipedia’s entry about the Holocaust to the top of the page. For now, at least.

The rest of the piece examines how and why such a self-evidently repugnant outcome becomes possible: not so much why Google won’t edit the results, but why Stormfront would rank so highly. Unsurprisingly, it comes down to money:

“… empirically speaking, people tend to treat Google like an authority. So this is an appalling shirking of responsibility. It’s about money. It always is. The commercial imperative trumps all other aims at the company, including moral ones.”

Why this content and not that?

So, a few revealing insights about what motivates two of the most powerful content platforms on the planet. These entities control what we see, what we read, what we’re exposed to, and what we consume. Between them, they control the vast majority of the information available to us. If they don’t want to show it to us, chances are we’re not going to see it.

What lessons do we draw from this? Once again: the importance of critical thinking. Why is Facebook serving up this story and burying that one? Why is Google ranking this at the top of its search results, and not that? What are we not seeing here? Why is our attention being directed to this thing at this time? There’s no need to go full-bore conspiracy theory here — just a healthy skepticism and willingness to do the work.

*In the spirit of disclosure, there are times when one doesn’t necessarily want Google to rewrite its algorithms.
