
Facebook plans customizable filters for nudity and violence – TechCrunch

February 16, 2017 by admin



Facebook wants to give you the power to define what is and isn’t objectionable, and to let those choices shape the local defaults for people who don’t voluntarily choose their own. You’ll be able to select how much nudity, violence, graphic content, and profanity you’re comfortable seeing.

Mark Zuckerberg revealed this massive shift in Facebook’s Community Standards policy today in his 5,000-word humanitarian manifesto; you can read our highlights and analysis of it here.

Currently, Facebook relies on a one-size-fits-most set of standards about what’s allowed on the network. The only exception is that it abides by local censorship laws. But that’s led to trouble for Facebook, as newsworthy historical photos including nudity and citizen journalism accounts of police violence have been wrongly removed, then restored after media backlash or executive review.

Zuckerberg explains the new policy, writing:

“The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.

With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow.”

This approach allows Facebook to give vocal, engaged users choice, while establishing reasonable localized norms, without ever forcing specific policies on anyone or requiring all users to configure complicated settings.
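
To make the mechanics concrete, here is a minimal sketch in Python of how such a scheme could be modeled. The category names, the 0-to-3 scale, and every function below (regional_default, effective_settings, should_remove, should_hide_for) are hypothetical illustrations built only from Zuckerberg’s description of personal thresholds, regional-majority defaults, and removal only beyond the most permissive option; none of this reflects Facebook’s actual implementation.

from collections import Counter
from typing import Dict, List, Optional

# Hypothetical objectionable-content categories and permissiveness scale.
CATEGORIES = ["nudity", "violence", "graphic_content", "profanity"]
SCALE_MAX = 3  # 0 = strictest setting, 3 = most permissive setting

def regional_default(regional_choices: List[Dict[str, int]]) -> Dict[str, int]:
    """Default for users who never choose: the most common setting per category
    among users in the region who did choose, 'like a referendum'."""
    defaults = {}
    for cat in CATEGORIES:
        votes = Counter(choice[cat] for choice in regional_choices if cat in choice)
        defaults[cat] = votes.most_common(1)[0][0] if votes else SCALE_MAX // 2
    return defaults

def effective_settings(user_choice: Optional[Dict[str, int]],
                       region_default: Dict[str, int]) -> Dict[str, int]:
    """A user's own settings if they made a decision, otherwise the regional default."""
    return user_choice if user_choice is not None else region_default

def should_remove(content_rating: Dict[str, int]) -> bool:
    """Take content down only if it is more objectionable than even the most
    permissive option allows, i.e. its rating exceeds the top of the scale."""
    return any(content_rating.get(cat, 0) > SCALE_MAX for cat in CATEGORIES)

def should_hide_for(user_settings: Dict[str, int],
                    content_rating: Dict[str, int]) -> bool:
    """Hide (rather than delete) content that crosses this user's personal line."""
    return any(content_rating.get(cat, 0) > user_settings.get(cat, SCALE_MAX)
               for cat in CATEGORIES)

if __name__ == "__main__":
    # Two users in a region chose permissive settings; a third never chose.
    region = regional_default([{"nudity": 3, "violence": 2}, {"nudity": 3, "violence": 1}])
    settings = effective_settings(None, region)      # the non-voter inherits the regional default
    print(should_hide_for(settings, {"nudity": 2}))  # False: within the regional norm
    print(should_remove({"violence": 4}))            # True: beyond the most permissive option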

To classify potentially objectionable content, Facebook will lean more heavily on artificial intelligence, which already delivers 30% of all content flags to its human reviewers. Over time, Zuckerberg hopes Facebook’s AI will learn to make nuanced distinctions, such as between terrorist propaganda and a news report about a terrorist attack.

There are still plenty of questions about how this system will work. For example, what happens to teens? Do they get strict defaults or the same control, and do parents have license to select their kids’ settings?

This new system of governance could make Facebook’s policies feel less overt, as they should align with local norms. It might also be a boon to certain content creators, such as photographers or painters who make nude art, videographers who capture action or war, or unfiltered pundits with niche views.

Personalized and localized site governance might prove more democratic than treating Facebook as one giant country. Its 2012 experiment with letting people vote on policies failed and was scrapped because it required 30% of all users to vote on long, complicated documents of proposed changes for the majority decision to be binding. That threshold worked out to roughly 300 million votes at the time, but the final vote received just 619,000. Now users who don’t “vote” on their settings receive the local defaults, “like a referendum” in a U.S. state.

[Photo: Mark Zuckerberg talking about his letter to the community at Facebook’s internal quarterly all-company meeting]

Zuckerberg also outlined several other product development plans. Facebook hopes to add more suggestions for local Groups to tie users deeper into their communities. Facebook will also give Group leaders more tools, akin to what it provides Page owners. Zuckerberg didn’t provide specifics, but those features might include analytics about what content is engaging, the ability to set more types of admins and moderators, or the option to add outside app functionality.

As for safety and information, Facebook wants to expand AI detection of bullying and self-harm, and potentially allow people to report mental health issues, disease, or crime. And to fight polarization and sensationalism, not just objectively fake news, it wants to present users with a range of sources across the political spectrum on a given topic. That could come through showing Related Articles on links that draw on sources from other parts of the spectrum.

The central theme of these changes is Facebook empowering users to define their own experience. It wants to see the world move towards a supportive, safe, informed, civically engaged, and inclusionary global community. But it still sees itself as just a tool, with the direction of progress defined by those who wield it.


