2016-11-10

Yesterday I wrote that people rushing to blame Facebook for the election results were being ridiculous, and it generated a fair bit of discussion (much of it on Twitter). And this was before NYMag's Max Read went out and wrote an article literally titled Donald Trump Won Because of Facebook. Here's the crux of Max's argument, which is similar to the argument many others have been making:

The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news. Fake news is not a problem unique to Facebook, but Facebook’s enormous audience, and the mechanisms of distribution on which the site relies — i.e., the emotionally charged activity of sharing, and the show-me-more-like-this feedback loop of the news feed algorithm — makes it the only site to support a genuinely lucrative market in which shady publishers arbitrage traffic by enticing people off of Facebook and onto ad-festooned websites, using stories that are alternately made up, incorrect, exaggerated beyond all relationship to truth, or all three. (To really hammer home the cyberdystopia aspect of this: A significant number of the sites are run by Macedonian teenagers looking to make some scratch.)

All throughout the election, these fake stories, sometimes papered over with flimsy “parody site” disclosures somewhere in small type, circulated throughout Facebook: The Pope endorses Trump. Hillary Clinton bought $137 million in illegal arms. The Clintons bought a $200 million house in the Maldives. The valiant efforts of Snopes and other debunking organizations were insufficient; Facebook’s labyrinthine sharing and privacy settings mean that fact-checks get lost in the shuffle. Often, no one would even need to click on the story for the headline to become a widely distributed talking point, repeated elsewhere online, or, sometimes, in real life.

Meanwhile, Bloomberg had a big piece saying that Facebook (and Twitter) employees are "grappling with their role" in helping to elect Trump.

Online (on Facebook, of course), current and former employees debated the company's role as an influencer. Bobby Goodlatte, a Facebook product designer from 2008 to 2012, according to his LinkedIn, today said the company's news feed was responsible for fueling “highly partisan, fact-light media outlets” that propelled Donald Trump's ascension to the presidency. “News feed optimizes for engagement,” Goodlatte wrote. “As we’ve learned in this election, bullshit is highly engaging.”

These stories sound convenient. And my Twitter feed is chock full of people -- often people in the media who are already "angry" about Facebook "stealing" their ad revenue -- making similar noises about how Facebook needs to "fix" this.

Fix your platform. You're a media company and with great power comes great responsibility. https://t.co/fsGSSEkIMK

— Kelly Fincham (@kellyfincham) November 10, 2016

Dear @twitter and @facebook - do the right thing, and soon. Maybe started as a 'platform,' or tool etc. yes but we're way past that now. https://t.co/gqA2FS3akX

— Kaan Yigit (@kyigit) November 10, 2016

When the dust settles, we gotta have a conversation about Facebook's responsibilities to the public. https://t.co/jK55IjfXoo

— Matt Pearce (@mattdpearce) November 9, 2016

And these stories tell a neat and convenient tale, a pre-packaged "thing" to blame. And they're all bullshit. Yes, Facebook had lots of people passing around fake news stories, or misleading news stories. And, yes, lots of people live in bubbles where they only see/read/hear stuff that they are prone to agree with. But this narrative that it was Facebook's "primed for engagement, not truth" algorithm that got people to go out and vote for Trump is both simplistic and dangerous. Let's take each problem separately.

Too Simplistic:

Blaming the Facebook algorithm for sharing fake news is too simplistic in that it gives the algorithm too much power and takes the responsibility away from human beings as living, thinking creatures. We love to blame the tools. It's practically a national pastime, searching for the moral panic du jour to blame for people doing things that some other people don't like or find problematic. Blaming the tool is always easier than blaming the people using it.

Even worse is that it assumes millions of people are pure idiots. And, I know, among many people this may be a popular opinion right now. That if they supported "the other side" they must be complete idiots. But that's wrong. There are idiot supporters of every candidate in this election -- and we can all highlight our favorite that somehow got onto the news. But lots and lots and lots of people who voted for Trump weren't doing so because some Facebook algorithm "tricked" them, but because they legitimately believed that the status quo wasn't working and was problematic, and an awful lot of "the establishment" wanted them to shut up about what wasn't working. You can argue that they were misled about what was and what wasn't working, but again, that goes back to the idea that tens of millions of people are so stupid that they change their minds based on fake stories on Facebook.

Too Dangerous:

I write an awful lot about Section 230 of the CDA and the idea of "intermediary liability" protections and I know that some people's eyes glaze over at those terms. But there's a fundamental underlying principle behind those things and it's this: if you blame a platform for the actions of its users, you end up with massive censorship and dangerous limits on free speech and innovation.

The people calling for Facebook to "fix" this problem don't see where this leads, but it's not good. In various conversations I've had in response to yesterday's article, I keep drilling down and trying to see what people think the "solution" to this "problem" is, and it inevitably comes back to something along the lines of "well, Facebook needs to stop the fake news from spreading." If only it could. Fake news, rumors, conspiracy theories, echo chambers and "bubbles" predate Facebook by a long shot. While the musical Hamilton is reminding people that some of our founding fathers were known to fight hard against each other, not everyone is aware of the spreading of rumors and lies between Thomas Jefferson and John Adams as they campaigned for the presidency in 1800:

Jefferson secretly hired the famed pamphleteer James Callendar, who had previously seriously damaged the reputation of Adams' fellow Federalist Alexander Hamilton, to paint Adams and the Federalist party as a friend to British royalty and Adams as being bent on starting a war with France in order to further an alliance with King George. More to the point, Callender described Adams as a "hideous hermaphroditical character which has neither the force and firmness of a man, nor the gentleness and sensibility of a woman."

Adams' Federalist surrogates also brought out the proverbial long knives. A Federalist publication described Jefferson as "a mean-spirited, low-lived fellow, the son of a half-breed Indian squaw, sired by a Virginia mulatto father." Allegations were made that he cheated his British creditors, was a supporter of French radicalism and assassinations of the aristocracy, and that he made a habit out of sleeping with his female slaves.

Or read about the history of the 1828 election between Andrew Jackson and John Quincy Adams and you might notice more than a few parallels to today -- including the spreading of fake stories about each candidate by surrogates. Here's just a snippet:

One Adams newspaper even wrote, "General Jackson's mother was a common prostitute, brought to this country by the British soldiers! She afterward married a mulatto man, with whom she had several children, of which number General Jackson is one!"

In 1876, opponents of Rutherford B. Hayes spread the rumor that he had shot his own mother. In 1928, supporters of Herbert Hoover started spreading rumors that (the Catholic) Al Smith was connecting the newly built Holland Tunnel in NY all the way to the Vatican so that the Pope would weigh in on all Presidential matters. In 1952, Dwight Eisenhower supporters distributed pamphlets claiming that his opponent, Adlai Stevenson, had once killed a young girl "in a jealous rage."

Point being: fake news is spread in basically every election for the US President in history. It didn't take Facebook's algorithms and it won't go away if Facebook's algorithms change.

In fact, it's likely to make things even worse. Remember the mostly made up "controversy" about Facebook suppressing conservative news? Remember the outrage it provoked (or have you already forgotten?). Just imagine what would happen if Facebook now decided that it was only going to let people share "true" news. Whoever gets to decide that kind of thing has tremendous power -- and there will immediately be claims of bias and hiding "important" stories -- even if they're bullshit. It will lead many of the people who are already angry about things to argue that their views are being suppressed and hidden and that they are being "censored." That's not a good recipe. And it's an especially terrible recipe if people really want to understand why so many people are so angry at the status quo.

Telling them that the news needs to be censored to "protect" them isn't going to magically turn Trump supporters into Hillary supporters. It will just convince them that they're even more persecuted.

Other than "censoring" certain content, the only other serious suggestion I heard was that Facebook should force feed its users opposing views. Like that's actually going to change anyone's mind, rather than just get them pissed off again. And, once again, this seems like people failing to take responsibility for their own actions. If you don't have any friends who supported Trump, don't blame Facebook for that.

There are legitimate questions about how you can better inform a populace. But censorship and forcefeeding information is generally paternalistic nonsense that totally misunderstands the issue and misdiagnoses the problem. As Clay Shirky noted earlier this year, too many Hillary supporters thought that "bringing fact checkers to a culture war" would win out, when that's never going to happen. Fighting Facebook's algorithm is more of the same nonsense. It's based in the faulty belief that those who voted for "the other" are simply too dumb to understand the truth, and if they just got more truth, they'd buy it. It's not understanding why they voted the way they did. It's looking for easy scapegoats.

Facebook's algorithm is an easy target, but it's even less likely to solve a culture war than fact checkers.



