2013-05-13

Debate: Should it be YouTube’s responsibility if someone posts an inflammatory video, or is it the responsibility of the person who posted the video?

Introduction

YouTube is the number one video-hosting website. Users reportedly watch 4 billion hours of video each month and upload 72 hours of video every minute. Launched in 2005, YouTube has grown over the last eight years into the largest website of its kind.[1] Anyone can create an account and upload almost any video for anyone to view, simply, easily, and for free. That openness has driven YouTube’s phenomenal success, with millions of videos enjoyed by millions of people. Unfortunately, a percentage of users upload distasteful videos, and some of those videos have incited riots and inspired viewers to copy the horrible acts they depict. There are already enough bad messages in the world without more of them being promoted on a site that hundreds of millions of people use. Just because YouTube is a place where almost anything can be uploaded does not mean people should use it in negative ways, but, as with many other good things in life, a few people abuse the platform and ruin it for everyone else. It is lousy that a few bad apples can spoil the entire barrel. The situation needs improvement, but whose responsibility is it: YouTube as a company, or the people who upload the irresponsible videos? Lately those questions have stirred up much controversy and a real dilemma over the fairest way to handle the issue.

 

There are certain privacy settings a user can enable while filling out the information YouTube asks for before a video goes live. However, those privacy settings are not very advanced and do nothing to block videos that promote violence or hatred toward others, or that deal with inflammatory topics. That is why YouTube gets the blame when such videos appear on its website: the company allows them to be posted, simple as that.

 

The Beginning of YouTube and its Power

People post many different types of videos: personal home movies, movie trailers, television shows, fan-made music videos alongside artists’ official ones, captioned lyric videos (like karaoke), comedy shows, documentaries, how-to and DIY (do-it-yourself) videos, stage performances, and many more. The possibilities of what can be seen on YouTube are endless. YouTube has also jump-started the careers of celebrities like Justin Bieber and Psy, who sings “Gangnam Style.”[2] YouTube has created a whole new world and made video the biggest way to consume media. Former PayPal employees Chad Hurley, Steven Chen, and Jawed Karim founded YouTube, discussing its development for about a year before its official launch on February 14, 2005. In October 2006, Google, co-founded by Larry Page and Sergey Brin, saw YouTube’s potential and bought the video-sharing website for $1.65 billion.[3] At the time, Google’s CEO Eric Schmidt called it “the next step in the evolution of the Internet.”[4]

 

This was the start of big things for YouTube, and everyone wanted a piece of the action. People were excited about how video was taking over and uploaded their own clips at a rapid rate, because before YouTube there was no major website geared toward watching videos, let alone one that let anyone post their own. But being able to post whatever you want causes problems when it comes to sexual videos, hate speech, bullying videos, clips of violent video games, violent homemade movies, recordings of fights, videos full of cursing, and other content of that nature. Say a fight breaks out somewhere, gets recorded, and the video is posted to YouTube. The video goes viral and, as a result, people start emulating the fight they saw and someone gets seriously injured. In a case like that, who is responsible? Is it YouTube for hosting the video, or is it the fault of the person who posted it? That is where the controversy lies, and where YouTube needs to watch its back, protect itself, and at the same time take steps to improve the video environment.

 

Hundreds of millions of users across the globe come to YouTube to discover and shape the world through video. The first video ever uploaded to YouTube, entitled “Me at the Zoo,” shows co-founder Jawed Karim at the zoo; it was uploaded on April 23, 2005, is only 19 seconds long, and has been watched 1.96 million times.[5] YouTube is the host website that gives people their spot in the limelight, and it is an excellent feeling to watch your video’s view count climb. Everyone wants their video viewed as many times as possible, or they would not have uploaded and shared it with the world. But not all content on YouTube should be up there. YouTube is just the host, and ordinary people are the reason so many videos hurt others, cause problems, or are simply in poor taste. YouTube needs a way to scan each video and judge whether hosting it among its collection will ultimately land the company in hot water.

 

Privacy Settings on YouTube

The current privacy settings let the user who posts a video require his or her approval of viewers’ comments before they are shared with the public, so comments can be screened for appropriateness and good taste. Other settings include choosing “public” to share a video with the world, or “private” to limit it to a select group of up to 25 people, such as friends.[6] YouTube has also added another level of privacy, called “unlisted,” that makes it easier to share clips with selected people without actually setting the video to private: the poster shares the URL with certain people, who can then watch the video on YouTube even though it is excluded from YouTube’s searchable database and will not show up in search results.[7] The “unlisted” option lets posters of questionable videos still show them to chosen viewers while keeping them out of the open, where they could easily be found and cause a stir, with bad repercussions. This creates a gray area for inflammatory videos: they can now disguise themselves so they are not easily searched and “flagged,” yet they still live within YouTube’s video library and can be sent around the Internet using the URL link.
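
To make the difference among the three settings concrete, here is a minimal sketch in Python of the access rules described above. Every name in it (Privacy, Video, can_watch, and so on) is a hypothetical illustration; YouTube’s real implementation is not public, so this models only the logic, not the actual system.

```python
from enum import Enum

class Privacy(Enum):
    PUBLIC = "public"      # visible to everyone and included in search results
    UNLISTED = "unlisted"  # watchable by anyone holding the URL, not searchable
    PRIVATE = "private"    # watchable only by up to 25 invited accounts

class Video:
    def __init__(self, video_id, privacy, invited=None):
        self.video_id = video_id
        self.privacy = privacy
        self.invited = set(invited or [])  # accounts granted access when private

    def appears_in_search(self):
        # Only public videos are indexed; unlisted videos stay out of search
        # results even though their URL still works.
        return self.privacy is Privacy.PUBLIC

    def can_watch(self, viewer, has_url):
        if self.privacy is Privacy.PUBLIC:
            return True
        if self.privacy is Privacy.UNLISTED:
            return has_url  # possession of the link is the only gate
        return viewer in self.invited  # private: invitation required

# An unlisted video is invisible to search yet open to anyone with the link.
clip = Video("abc123", Privacy.UNLISTED)
assert not clip.appears_in_search()
assert clip.can_watch(viewer="anyone", has_url=True)
```

The design point the sketch highlights is the gray area itself: for an unlisted video, possession of the link is the only gate, which is exactly why such a video can circulate widely while never showing up in a search.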

 

Community Guidelines, Flagging and Reporting Abuses

Critics say that YouTube’s community guidelines are not strict enough and that many offensive videos remain posted. An extremely important way YouTube reduces the number of problematic videos is its “flagging” tool. Registered users can easily click the flag and report videos that (1) show nudity, particularly of a sexual nature, (2) hurt, attack, or humiliate, (3) contain speech that demeans a certain group, (4) incite violence or encourage dangerous illegal activities, or (5) show gory or violent acts.[8] Once a video is reported, YouTube says its staff works quickly to review it and remove videos that violate the rules. An additional idea would be to mark delisted videos (with a scarlet “X” covering the video’s still image) and issue reports to the YouTube community to show the progress being made. When you sign up for YouTube you must agree to the terms of service, which include the community guidelines; when they are violated, YouTube can terminate the user’s account.
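
To illustrate how the five reportable categories and the flag tool fit together, here is another minimal, hypothetical sketch; none of these names come from YouTube’s actual code, and all it assumes is what the guidelines above describe: a registered user, a reason, and a queue for staff review.

```python
from enum import Enum

class FlagReason(Enum):
    # The five reportable categories summarized above.
    SEXUAL_CONTENT = 1    # nudity, particularly of a sexual nature
    HARASSMENT = 2        # content that hurts, attacks, or humiliates
    HATE_SPEECH = 3       # speech that demeans a certain group
    DANGEROUS_ACTS = 4    # incites violence or encourages illegal activities
    GRAPHIC_VIOLENCE = 5  # gory or violent acts

def flag_video(video_id, reporter, reason, review_queue):
    """Record a registered user's report so staff can review it quickly."""
    review_queue.append({"video": video_id,
                         "reported_by": reporter,
                         "reason": reason.name})

# A registered user flags a video; the report joins the staff review queue.
queue = []
flag_video("abc123", "registered_user_42", FlagReason.HATE_SPEECH, queue)
assert queue[0]["reason"] == "HATE_SPEECH"
```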

 

Anti-Muslim Video

An article on the Forbes website describes how Google, which owns YouTube, blocked a bizarre anti-Islam film trailer on YouTube in Egypt and Libya. The film, entitled “Innocence of Muslims” and promoted by Pastor Terry Jones, enraged Islamist groups across the globe; major riots broke out and many people were badly injured. The film targeted the Prophet Muhammad, and even the most moderate and liberal-thinking Muslims took offense.[9] YouTube operates localized versions of its website around the world, so although Americans can still see this video, it has been temporarily blocked in Egypt and Libya, where people cannot view it. Google works very hard to create a public space where everyone can upload and watch what they enjoy and express differing opinions, but this causes problems: what people in one country find acceptable, people in another country might find wrong and offensive. The trailer is within YouTube’s guidelines and will therefore stay on YouTube for Americans to see, but, because of the difficult situations taking place in Libya and Egypt, YouTube has restricted access to the video in both of those countries.
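
Mechanically, this kind of country-by-country restriction boils down to a per-region deny list checked at playback time. The sketch below is a hypothetical illustration of that idea (the video ID and the use of ISO country codes are assumptions), not a description of YouTube’s actual infrastructure.

```python
# Map each restricted video to the countries where playback is refused.
blocked_regions = {"abc123": {"EG", "LY"}}  # ISO codes for Egypt and Libya

def is_viewable(video_id, viewer_country):
    """The video stays up globally; only requests from blocked regions fail."""
    return viewer_country not in blocked_regions.get(video_id, set())

assert is_viewable("abc123", "US")      # still watchable in America
assert not is_viewable("abc123", "EG")  # blocked in Egypt
assert not is_viewable("abc123", "LY")  # blocked in Libya
```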

 

Censoring

It is not an easy decision for a company to decide what gets censored and what does not, and sometimes it is too late to censor a video after it has already gone viral and people have formed their opinions. Some people oppose YouTube’s decision to block the “Innocence of Muslims” video because it does not violate YouTube’s current terms of service and falls within a person’s right to free speech. Others have said that blocking the video was a poor decision, made more out of fear of trouble breaking out than anything else; YouTube blocked the video in those two countries simply because the company knew of no better way to handle the situation. The video, though, was already all over the Internet, and censoring it after it went viral is like “covering your ears and eyes as your house burns around you.” The underlying problem is religious intolerance, and violent reactions to the video will occur whether people acknowledge them or not. However, now that YouTube has restricted access to the video in those countries, the protests over its negative effects have stopped. This is how companies such as Twitter, Facebook, and Google are being drawn into global debates over people’s access to inflammatory material. In another example, in May 2010 the Pakistani government temporarily cut off access to Facebook after a Seattle cartoonist mockingly suggested that a day be nicknamed “Everybody Draw Mohammed Day” and offensive cartoons of the Prophet Muhammad were drawn in response.[10]

 

New Directions

When YouTube starts actively censoring its content, the company heads in a slippery direction in which every video must be held to the same moral code or policy. YouTube was built on letting everyone post whatever they wanted; that was people’s freedom of expression. Now, because of some of the videos that have been posted, YouTube will have to take a firmer position on where freedom of expression ends. People hope the more active role in censorship is temporary and that things will instead shape up and move in a positive direction that suits the public without causing too much of an uproar.

Everyone wants to be able to say what they want, but sometimes people cross the line, and without rules that are clearly stated in writing, YouTube will not be able to do anything about inflammatory videos. That is why YouTube needs to put clearer rules and terms of service in place and assign fault to the people who choose to post provocative and offensive videos. Even with rules there will be people who break them, partly because YouTube’s community guidelines are not prominent enough. Clearer rules would make it the fault of the person who uploaded the video rather than the fault of YouTube.

But once a video is posted, YouTube’s decision to remove or block it comes too late, because its effects are already out in the world. All YouTube has proven is that it can block a video after people react and decide they are offended. A video needs to be blocked before people see it, which means YouTube must take a more active role in checking videos before they are published for the world to see. For example, each uploaded video could be held for 24 hours before publication so that YouTube’s staff has a chance to scan it for inflammatory problems. In theory that sounds like a good idea, but with the number of videos posted every day, YouTube would never keep up with the growing volume unless it hired thousands more people. YouTube cannot be responsible for screening every last video on its website; that would be too big a job for even a huge company to manage, and much too expensive. One way to make it possible to review all videos within a day of their posting would be to enroll YouTube users as reviewers, compensated not with cash but with reward points, coupons, or other incentives provided by advertisers. It may be possible to enroll thousands of YouTube users as “watchers” and “flaggers”; Wikipedia has proven that volunteers do an awesome job of editing.

Even with a revamp of its video guidelines and other changes, the ultimate fault should fall on the shoulders of the person who uploaded the video for not obeying YouTube’s rules and regulations. If you drive drunk, speed recklessly, and hurt someone in an accident, you are responsible; you cannot blame the town for not posting enough speed-limit signs. Believe me, stricter guidelines will not stop the whole problem, but they are a start toward making people realize how their videos can affect others in negative ways.
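
For concreteness, here is one way the proposed 24-hour hold and volunteer review could be modeled. Everything in it, from the three-approval threshold to the early-release rule, is an assumption layered onto the proposal above; it is a sketch of the idea, not anything YouTube actually does.

```python
import datetime

HOLD_PERIOD = datetime.timedelta(hours=24)  # the proposed publication delay

class PendingVideo:
    def __init__(self, video_id, uploaded_at):
        self.video_id = video_id
        self.uploaded_at = uploaded_at
        self.volunteer_approvals = 0  # reviews by enrolled "watchers"
        self.flagged = False          # set when a reviewer finds a problem

def is_publishable(video, now, required_approvals=3):
    """Publish once the 24-hour hold elapses, unless the video was flagged;
    enough volunteer approvals release it early."""
    if video.flagged:
        return False
    if video.volunteer_approvals >= required_approvals:
        return True
    return now - video.uploaded_at >= HOLD_PERIOD

# A clip approved by three volunteers goes live before the hold expires.
now = datetime.datetime(2013, 5, 13, 12, 0)
clip = PendingVideo("abc123", uploaded_at=now - datetime.timedelta(hours=2))
clip.volunteer_approvals = 3
assert is_publishable(clip, now)
```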

 

YouTube cannot allow everything, because there is a fine line around what counts as appropriate free speech. Just because people have the right to free speech does not mean they can use it to slander others or post videos that will clearly cause controversy and lead viewers to act out what they watched in their own lives. After all, you have the right to say “fire,” but not in a crowded movie theater or nightclub. Because of how big YouTube has grown, the company will not catch everyone who posts inflammatory videos, but now that this has become a controversial issue and certain steps are being taken, people need to understand that YouTube will be much more aware of what is posted on its website. Because of past incidents, the company is keeping a more careful eye out for videos that offend others and cause problems in a world where we all need to live together peacefully, without hate.

 

Conclusion

As with any activity that involves millions of people, responsibility for appropriate behavior is shared among many parties: the people who post videos, YouTube the company, and we, the large community of users. Every party involved is responsible, and each should do its part to help fix the problem of inflammatory videos on YouTube. The blame cannot be placed on just one party, because all of them contributed to the situation, and all of them are needed to keep the videos on YouTube safe and in good taste.

 

[1] Rogowsky, Mark. “Here’s Looking At YouTube, Kid: Eight Years, One Billion Users, And Really Just Getting Started.” Forbes. Forbes Magazine, 21 Mar. 2013. Web. 16 Apr. 2013.

[2] Dickey, Megan Rose. “The 22 Key Turning Points In The History Of YouTube.” Business Insider. N.p., 15 Feb. 2013. Web. 16 Apr. 2013.

[3] “YouTube.” Wikipedia. Wikimedia Foundation, 15 Apr. 2013. Web. 16 Apr. 2013.

[4] “Google Buys YouTube for $1.65 Billion.” Msnbc.com. Associated Press, 10 Oct. 2006. Web. 16 Apr. 2013.

[5] Dickey, Megan Rose. “The 22 Key Turning Points In The History Of YouTube.” Business Insider. N.p., 15 Feb. 2013. Web. 16 Apr. 2013.

[6] Siegchrist, Gretchen. “YouTube Privacy Settings – Maintain Your Privacy on YouTube.” About.com Desktop Video. N.p., n.d. Web. 16 Apr. 2013.

[7] Lowensohn, Josh. “YouTube Gets Useful ‘Unlisted’ Video Option.” CNET News. CBS Interactive, 12 May 2010. Web. 23 Mar. 2013.

[8] “YouTube Community Guidelines.” YouTube. YouTube, n.d. Web. 30 Mar. 2013.

[9] Hill, Kashmir. “Google Blocks Bizarre Anti-Islam Film Trailer On YouTube In Egypt and Libya.” Forbes. Forbes Magazine, 13 Sept. 2012. Web. 16 Apr. 2013.

[10] “Pakistan Bans Facebook in Outrage over Online Competition to Draw Prophet Mohammed.” Mail Online. Mail Foreign Services, 20 May 2010. Web. 16 Apr. 2013.

 
