Facebook’s Hidden Policies and Purposes
Facebook is facing an ethics issue over whether it has control of the content posted to its site after a recent live incident on the social media website, according to The New York Times.
A Cleveland man shot an innocent bystander, Robert Godwin Sr., 74, last Monday while using Facebook Live, and more than two million people watched the video before it was removed. According to The New York Times, the content remained up and viewable for more than two hours after the incident, prompting the victim’s grandson, Ryan A. Godwin, to urge people on Twitter not to share the video.
Please please please stop retweeting that video and report anyone who has posted it! That is my grandfather show some respect #Cleveland
— Ryan A. Godwin (@god_winr) April 16, 2017
At its recent F8 conference in San Jose, where Facebook’s research lab showcased its latest technology, the company touted its position as the top social media site, with more than a billion average users per day. Facebook argues that it is unable to filter content under its terms of service, which grant users “the right to post anything they feel like contributing to the social media site.”
Other social media websites do not follow the same protocol. Instagram, MeetMe and Tagged all have terms of service that explicitly instruct users to “not impersonate or gain from certain content” on their sites, including inappropriate photos or videos.
Despite its recent promises of new technology, ranging from helicopters called Tether-Tenna, which would extend internet connection to the most remote locations in the United States, to typing messages with certain brainwaves instead of a keyboard, Facebook is facing scrutiny from users over its inability to take down content.
YouTube currently runs a “Content ID” tool that immediately identifies copyrighted videos using an algorithm that matches the audio and video of the original creator’s content, according to eff.org. During its Olympic coverage, Twitter showed that it could take down content as quickly as ten minutes after posting, according to vocativ.com, proving such algorithms exist on other social media sites.
Facebook is currently looking to hire an official to filter fake news being spread on the website. Yet the same website can take a person’s identity and information and sell it to any advertiser of its choosing, according to The Guardian, while it cannot filter inappropriate content quickly.
Another issue with unfiltered posting is Facebook’s free use of content at the user’s expense. Under the same terms of service agreement, anyone who uses the website agrees that anything they post is free for Facebook to use, and that their personal information, content and search history are accessible for Facebook to sell to advertisers.
For example, if a user posts a picture of themselves at the beach, anyone interested in that picture could freely take it for advertising or even impersonation, although many social media platforms run algorithms to prevent spam and fake bots that impersonate people.
“Students need to be more aware of the terms and agreement because that alone took me by surprise,” UCO student Jennifer Phillips said.