2014-09-08



The intimate celebrity photos that appeared online over Labor Day weekend didn't originate on Reddit — they showed up first on the image board 4chan. But as lawyers for the celebrities worked to scrub the photos from the web, a Reddit community called TheFappening became a clearinghouse for the images. This weekend, Reddit shuttered TheFappening, prompting outrage from free speech purists and lecherous men alike.

Many prominent sites that host user-generated content have terms of use that exclude this kind of material. But Reddit is different. The site takes an absolutist approach to freedom of speech, refusing to take down material unless doing so is required by law. Some critics accuse Reddit of hypocrisy for banning TheFappening while taking no action against other communities that have raised eyebrows. There are Reddit communities devoted to racism, violence against women, sex with animals, and many other ethically problematic topics.

But Reddit says it didn't ban TheFappening because of its content, and it has no intention of changing its absolutist free-speech stand. And under US law, there may not be much critics can do about that. The law gives websites broad immunity from liability for content posted by their users.

What's a subreddit?



Actress and hacking victim Jennifer Lawrence (Jim Spellman/WireImage)

Almost everything that happens on Reddit is controlled by users, not Reddit employees. The bulk of the action on the site occurs in "subreddits" devoted to particular topics. There's /r/technology for technology news, /r/funny for humor, /r/aww for pictures of cute animals, /r/makeupaddiction for beauty tips, /r/soccer for discussion of the world's favorite sport, and so forth.

These subreddits — there are tens of thousands of them — are created and managed by Reddit users. Each has one or more moderators who are given the power to customize the look of the subreddit, establish rules for what can be posted, delete content that runs afoul of the rules, and ban users who flout community norms.

These norms can vary widely. /r/aww only allows happy pictures, not sad ones. /r/makeupaddiction prohibits users from posting pictures that have been heavily photoshopped. Posts about politics aren't allowed on /r/funny, whereas discussion of tech-related political topics is a staple of /r/technology.

While most subreddits focus on mainstream, G-rated topics, there's also a fair amount of pornography. And a small minority of subreddits feature extreme content. There are subreddits called /r/cutefemalecorpses, /r/sexwithdogs, and /r/whiterights. No, I'm not going to link to them and yes, they're just as appalling as they sound.

Why would Reddit allow so much demented content on its site?

Reddit's management is committed to a broad interpretation of free speech. They believe that it isn't their job to police the morals of Reddit users. Instead, Reddit believes that each subreddit community has the right and responsibility to establish its own norms. People who feel uncomfortable in a particular subreddit are free to lobby its moderators to change the rules. If they're still unsatisfied, they can switch to other subreddits or start their own.

Reddit believes that each subreddit community has the right and responsibility to establish its own norms

Accordingly, Reddit believes that decisions about whether to permit (or even encourage) racist, misogynistic, or sexually perverted content in a subreddit should be made by the users and moderators of that particular subreddit — not by Reddit employees.

"We believe that you — the user — have the right to choose between right and wrong, good and evil, and that it is your responsibility to do so," wrote Reddit CEO Yishan Wong in a Saturday blog post. "We consider ourselves not just a company running a website, but the government of a new type of community. The role and responsibility of a government differs from that of a private corporation, in that it exercises restraint in the usage of its powers."

Reddit's management draws an explicit analogy to the First Amendment, which bars the US government from restricting freedom of speech, even speech that's highly offensive. Reddit touts a similarly absolute commitment to free speech within its own community.

If Reddit is so committed to free speech, why did it ban /r/thefappening?

TheFappening was a subreddit created for discussion of the celebrity photo leaks, and it quickly became a hub for promoting wider distribution of the photos. Reddit itself doesn't host images, but users would upload the photos to third-party sites and then post links to them in /r/thefappening.

TheFappening was like a crack house that required constant monitoring by law enforcement

Reddit avoids censorship as much as possible, but the site does comply with US law. In particular, Reddit will remove content that infringes copyright or constitutes child pornography. Given that some of the subjects were underage when the photos were taken, and that many of the images were selfies whose subjects therefore owned the copyright, lawyers for the victims had legal grounds to demand their removal.

According to a Reddit administrator, things "quickly devolved into a game of whack-a-mole." Reddit would get a takedown request and remove content, only to have users upload another copy of the image and submit another link to /r/thefappening.

"It became obvious that we were either going to have to watch these subreddits constantly, or shut them down," Reddit says.

So according to Reddit management, the problem with TheFappening wasn't that the subreddit had objectionable content, or even that some of that content was potentially illegal. The problem was that users were submitting so much legally questionable content that it was draining Reddit's administrative resources. If the administrators are Reddit's police force, /r/thefappening was like a crack house that required constant monitoring by law enforcement.

Doesn't this approach privilege those who are wealthy enough to hire a lot of lawyers?



Model and hacking victim Kate Upton (Brad Barket/Getty Images)

That's what the critics argue. They point out that there are subreddits devoted to sharing nude images of non-celebrity women that may have been published without their knowledge or consent. But because these women don't have the resources of Jennifer Lawrence and Kate Upton, they aren't able to generate a blizzard of takedown requests.

Reddit critics also accuse the site of being unduly influenced by media attention. For example, Reddit used to have a subreddit called /r/jailbait that — unsurprisingly — attracted pornographic images of underage girls. It was popular enough to win a "subreddit of the year" vote in 2008. It was shut down only after it became the subject of unflattering coverage on CNN. Reddit also banned a subreddit called /r/creepshots, dedicated to "upskirt" photographs, after a Gawker exposé on its founder. But other subreddits with equally disturbing content but less media attention remain open for business.

Is Reddit hypocritical to protect the anonymity of its own users?

While Reddit has taken a relatively passive approach to removing stolen nude photos, it's more active in removing posts that "dox" another Reddit user — that is, expose their real name, address, phone number, or other personal information.

So why doesn't Reddit's absolutist free speech policy extend to this kind of information? Reddit says it has had a recurring problem where a Reddit user would be "doxed" and then face anonymous phone calls, unordered pizza deliveries, and other forms of harassment. To prevent this kind of misbehavior, Reddit bans users from posting anyone's personal information.


You might think stolen photos are in a similar category — women have obviously faced harassment after nude photographs were posted online. But Reddit says it's too difficult to know whether a photo has been posted with or without the consent of its subject. So rather than trying to make a lot of tricky judgment calls, it just allows photos across the board.

Here again, critics say, Reddit seems to offer different levels of free speech to different people. The men who are helping to distribute stolen photos enjoy the benefits of Reddit's pro-anonymity guarantees. But Reddit has refused to do much to help the women in the photos, who have suffered a more severe invasion of their privacy.

Could hosting all of this questionable material put Reddit in legal hot water?

The content in certain subreddits could raise a wide variety of legal issues, from defamation to violations of civil rights laws to copyright infringement. But the law largely immunizes Reddit itself.

Section 230 of the 1996 Communications Decency Act gives online companies broad immunity for content posted by their users. There are a few exceptions, notably child pornography and intellectual property. Another statute, Section 512 of the 1998 Digital Millennium Copyright Act, shields sites like Reddit from copyright liability if they respond promptly to takedown requests from copyright holders.

Reddit wants to be seen as a passive provider of infrastructure, but to many people it looks like an absentee landlord

Reddit has hewed closely to these rules. The company removes child pornography whenever it finds it, and it complies with DMCA requests to take down infringing content. (This option isn't always available to the subjects of stolen photos, since they only own the copyright if they took the pictures themselves.) But otherwise, the site is a free-for-all. And it's probably on safe legal ground.

Don't online services have a moral obligation to do more?

People disagree about this. And our intuitions on the subject depend a lot on context.

Think about Google, for example. While many people used /r/thefappening to find stolen celebrity nudes, many others used Google's search engine. And in operating its search engine, Google follows a policy much like Reddit's: it will take down material when it's legally required to do so. But otherwise it indexes the whole web and lets users decide what to search for and where to click. You can use Google to find not only the celebrity photos but a wide variety of other offensive content too.

Or consider Comcast. Presumably, many of the men scouring the internet for the stolen pictures were Comcast customers who used Comcast's network to locate and download the images. Yet Comcast doesn't try to filter out these pictures. Indeed, many people believe that broadband providers should be legally prohibited from blocking legal content flowing over their networks.

But many people feel that the kind of neutrality that characterizes a network owner or search engine isn't appropriate for a company that supports the creation of online communities. Reddit wants to be seen as a passive provider of infrastructure, but to many people it looks like an absentee landlord who's allowing criminals to operate out of his property. The Verge's T.C. Sottek compares Reddit to a failed state that's unwilling or unable to maintain law and order within its borders.

Could Reddit do more to stamp out stolen celebrity photos and other questionable content?


It could certainly try. Right now, Reddit does the bare minimum the law requires. It could be more proactive about shuttering communities that promote illegal activities, banning users who post problematic content, and filtering posts for racist, misogynistic, or otherwise problematic material.

But establishing a consistent and effective system for taking down questionable content could be challenging. And it's not clear how much good it would do.

The seemingly endless campaign against pirated material provides a cautionary tale. For more than a decade, Hollywood and the recording industry have been using their considerable legal and lobbying clout to force intermediaries to take greater responsibility for the infringing content their users distribute.

Content companies have shut down numerous websites that made it too easy to share copyrighted material. Some have even faced criminal penalties. Recording companies sued thousands of individual users who distributed material on peer-to-peer file-sharing networks. They've pressured credit card companies and ad networks to stop working with sites that promote infringing content. They've persuaded broadband providers to penalize users who repeatedly engage in piracy.

Despite all those efforts, which have consumed millions of dollars over more than a decade, copyright-infringing material remains widely available online. Every time an intermediary is shut down, new ones pop up to take its place. Other users share files directly over peer-to-peer networks. As laws have been enforced more strictly in the United States, file-hosting services have increasingly shifted overseas.

A similar point applies to celebrities' stolen photos. There are millions of people who want to see them, and evidently a critical mass of them is willing to defy authorities and social norms to help distribute them.

Despite all those efforts, copyright-infringing material remains widely available online

In the face of that kind of determined resistance, it would be a pretty big challenge just for Reddit to get the photos off its own website. And it's never going to be possible to scrub them from the internet as a whole. It may not even be feasible to make them very difficult to find.

This isn't to say that Reddit shouldn't change its policies. Even if banning racist, misogynistic, and necrophiliac communities from Reddit would simply move them to an even seedier corner of the internet, it might still help to reinforce social norms against the distribution of those types of content. And it would make it a little harder to find the photos. But not very much harder.

Why does US law give websites such strong legal protections in the first place?

As mentioned previously, there are two different provisions of federal law that give websites broad immunity from liability for the actions of their users. Both provisions were inserted at the insistence of large telecommunications providers who were worried that they would otherwise face ruinous liability for their users' actions. But they were written in a way that extended protection to online services generally.

Advocates argue that the law has two big benefits. First, it makes the US a more hospitable place for online innovation. If online services were directly liable for content submitted by their users, investors might hesitate to back startups like YouTube or Twitter that host large amounts of user-generated content. Without the legal protections of US law, these sites might never have gotten off the ground — at least not in their present form.

Second, the legal protections that sites like Reddit and Google enjoy promote a culture of greater free speech online. That might not seem like such a great thing when it comes to the distribution of stolen photos, but there are other circumstances where such protections serve important public purposes. Imagine, for example, a citizen activist who had evidence of wrongdoing by a powerful public official and wanted to post this information on a blogging platform. In a world without liability protections, website hosts might be reluctant to host content that could expose them to a defamation lawsuit, and powerful figures could more easily intimidate web hosting providers into removing such content.
