Facebook : Preventing Child Exploitation on Our Apps

02/23/2021 | 12:34pm EDT

Using our apps to harm children is abhorrent and unacceptable. Our industry-leading efforts to combat child exploitation focus on preventing abuse, detecting and reporting content that violates our policies, and working with experts and authorities to keep children safe.

Today, we're announcing new tools we're testing to keep people from sharing content that victimizes children and recent improvements we've made to our detection and reporting tools.

Focusing on Prevention

To understand how and why people share child exploitative content on Facebook and Instagram, we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020. We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. This data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly. Even so, one victim of this horrible crime is one too many.
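The prevalence of exact and near-duplicate copies is why hash banking is effective against re-shared content: hashes of known violating material can flag both exact copies and slightly altered ones. Below is a minimal, hypothetical sketch of that matching step (the hash values and distance threshold are made up; production systems use perceptual hashes, such as Meta's open-source PDQ for images, so visually similar copies still match within a small Hamming distance):

```python
# Near-duplicate detection against a bank of hashes of known violating
# content. The 64-bit hash values below are illustrative placeholders;
# a real system would use perceptual hashes (e.g. PDQ) so that a
# slightly altered copy of an image still lands near the banked hash.

HASH_BANK = {0x9F3A5C7E12B4D6F8, 0x0F0F0F0F0F0F0F0F}  # hypothetical banked hashes
MAX_DISTANCE = 8  # bits of difference tolerated for a "visually similar" match

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_bank(candidate: int) -> bool:
    """True if the candidate hash is within MAX_DISTANCE of any banked hash."""
    return any(hamming(candidate, banked) <= MAX_DISTANCE for banked in HASH_BANK)

print(matches_bank(0x9F3A5C7E12B4D6F8))  # exact copy of a banked hash
print(matches_bank(0x9F3A5C7E12B4D6F9))  # one bit differs (slightly altered copy)
print(matches_bank(0x123456789ABCDEF0))  # unrelated content
```

The threshold trades recall against false positives: a larger `MAX_DISTANCE` catches more altered copies but risks matching unrelated content.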

The fact that only a few pieces of content were responsible for many reports suggests that a greater understanding of intent could help us prevent this revictimization. We worked with leading experts on child exploitation, including NCMEC, to develop a research-backed taxonomy to categorize a person's apparent intent in sharing this content. Based on this taxonomy, we evaluated 150 accounts that we reported to NCMEC for uploading child exploitative content in July and August of 2020 and January 2021, and we estimate that more than 75% of these people did not exhibit malicious intent (i.e. did not intend to harm a child). Instead, they appeared to share for other reasons, such as outrage or poor humor (e.g. a child's genitals being bitten by an animal). While this study represents our best understanding, these findings should not be considered a precise measure of the child safety ecosystem. Our work to understand intent is ongoing.

Based on our findings, we are developing targeted solutions, including new tools and policies to reduce the sharing of this type of content. We've started by testing two new tools - one aimed at the potentially malicious searching for this content and another aimed at the non-malicious sharing of this content. The first is a pop-up that is shown to people who search for terms on our apps associated with child exploitation. The pop-up offers ways to get help from offender diversion organizations and shares information about the consequences of viewing illegal content.
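The first tool's gating logic can be pictured as a check of the search query against a deterrence term list, with matching queries routed to a help pop-up instead of results. This is a highly simplified, hypothetical sketch (the term list and message text are placeholders; real matching would be far more sophisticated than token overlap):

```python
# Intercept searches matching a deterrence term list and return a help
# pop-up instead of search results. Terms and message are placeholders,
# not Facebook's actual term list or wording.

DETERRENCE_TERMS = {"placeholder_term_1", "placeholder_term_2"}

def handle_search(query: str) -> str:
    """Return a deterrence pop-up for flagged queries, else run the search."""
    tokens = set(query.lower().split())
    if tokens & DETERRENCE_TERMS:
        return ("Searches like this may be associated with child exploitation, "
                "and viewing such material has legal consequences. "
                "Help is available from offender diversion organizations.")
    return f"results for: {query}"

print(handle_search("placeholder_term_1 images"))
print(handle_search("holiday photos"))
```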


The second is a safety alert that informs people who have shared viral meme content that is child exploitative about the harm it can cause, and warns that it is against our policies and that there are legal consequences for sharing this material. We share this safety alert in addition to removing the content, banking it and reporting it to NCMEC. Accounts that promote this content will be removed. We are using insights from this safety alert to help us identify behavioral signals of those who might be at risk of sharing this material, so we can also educate them on why it is harmful and encourage them not to share it on any surface - public or private.


Improving Our Detection Capabilities

For years, we've used technology to find child exploitative content and detect possible inappropriate interactions with children or child grooming. But we've expanded our work to detect and remove networks that violate our child exploitation policies, similar to our efforts against coordinated inauthentic behavior and dangerous organizations.

In addition, we've updated our child safety policies to clarify that we will remove Facebook profiles, Pages, groups and Instagram accounts that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in the image. We've always removed content that explicitly sexualizes children, but content that isn't explicit and doesn't depict child nudity is harder to define. Under this new policy, while the images alone may not break our rules, the accompanying text can help us better determine whether the content is sexualizing children and if the associated profile, Page, group or account should be removed.

Updating Our Reporting Tools

After consultations with child safety experts and organizations, we've made it easier to report content for violating our child exploitation policies. To do this, we added the option to choose 'involves a child' under the 'Nudity & Sexual Activity' category of reporting in more places on Facebook and Instagram. These reports will be prioritized for review. We also started using Google's Content Safety API to help us better prioritize content that may contain child exploitation for our content reviewers to assess.
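Prioritizing these reports for review can be modeled as a priority queue keyed on a severity signal, where reports tagged 'involves a child' receive boosted scores. The sketch below is a generic illustration, not Facebook's implementation; the report IDs and severity values are hypothetical stand-ins for a classifier signal such as the one the Content Safety API provides:

```python
import heapq

# A review queue that surfaces the highest-severity reports first.
# Severity scores are hypothetical stand-ins for a prioritization
# signal (e.g. from a content classifier).

class ReviewQueue:
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps insertion order stable

    def add(self, report_id: str, severity: float) -> None:
        # Negate severity: heapq is a min-heap, we want max-severity first.
        heapq.heappush(self._heap, (-severity, self._counter, report_id))
        self._counter += 1

    def next_for_review(self) -> str:
        return heapq.heappop(self._heap)[2]

q = ReviewQueue()
q.add("report-a", severity=0.42)
q.add("report-b", severity=0.97)  # 'involves a child' -> boosted severity
q.add("report-c", severity=0.15)
print(q.next_for_review())  # report-b is reviewed first
```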


To learn more about our ongoing efforts to protect children in both public and private spaces on our apps, visit facebook.com/safety/onlinechildprotection.


Facebook Inc. published this content on 23 February 2021 and is solely responsible for the information contained therein. Distributed by Public, unedited and unaltered, on 23 February 2021 17:33:01 UTC.

© Publicnow 2021