Facebook Under Fire for Bizarre Child Predator Survey Question

Facebook has come under fire after posing a survey question over the weekend about how it should deal with predatory sexual behavior toward children. Survey participants were asked whether sexual predators should be allowed to request photographic images from 14-year-old girls online, and how Facebook should handle such a request if it learned about one. The survey also asked whether the site should better manage extremist content and whether cultural norms should be taken into account.
TechNewsWorld

iPhone and Apple Watch Emergency SOS feature saves woman, child after collision

After a horrifying collision with a drunk driver at a stoplight, a woman saved her own life by using her Apple Watch's Emergency SOS feature to call for help.
AppleInsider – Frontpage News

Apple briefly pulled Telegram over child pornography distribution

When Apple temporarily pulled Telegram from the App Store over "inappropriate content," it left many wondering just what that content was. We now know: 9to5Mac has learned that the company removed the app after discovering that people had been distr…
Engadget RSS Feed

Apple pulled Telegram from App Store over child pornography


Telegram’s developer provided few details as to why its secure messaging applications briefly disappeared and reappeared in Apple’s App Store several days ago, but Apple today confirmed that it pulled the apps for a serious reason: Telegram was serving child pornography to users, and it wouldn’t be allowed back in the App Store until the issue was fixed.

According to a report from 9to5Mac, App Store chief Phil Schiller said that Apple had been alerted that Telegram’s apps were sharing child pornography, which Apple verified, removing the apps and notifying authorities. Rather than remaining passive about the problem, Apple then pushed the developer to remove the content, ban the users responsible for posting it, and install “more controls to keep this illegal activity from happening again.”

Apple’s removal of Telegram from the App Store coincided with seemingly minor updates to the developer’s Android apps, suggesting that nothing serious was amiss until Apple said otherwise. Telegram has long promised users ultra-secure communications that cannot be read even by foreign governments, but it has been targeted by Iranian state-sponsored hackers, and more recently criticized by Russian authorities for facilitating terrorism. Telegram has previously brushed off complaints about bad uses of its service, suggesting that dangerous users will simply change apps, and truly blocking them would require blocking the internet.

Nonetheless, Apple clearly believed that Telegram could do more to police bad users, and apparently leveraged the possible loss of its iOS user base to force a rapid change. Schiller explained in an email that Apple “will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity… we have zero tolerance for any activity that puts children at risk.”

Apple’s action on behalf of at-risk children comes several weeks after activist investors asked the company to do more to protect children from “iPhone addiction,” and CEO Tim Cook suggested that social media might share blame for device “overuse.” Schiller’s email is reproduced below.

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.

Apple – VentureBeat

Telegram was pulled because of child pornography, says Apple’s Phil Schiller

The "inappropriate content" that saw Telegram briefly disappear from the App Store last week was child pornography, Apple’s marketing chief explained in response to a customer question.
AppleInsider – Frontpage News

Telegram iOS app removed from App Store last week due to child pornography

The encrypted messaging app Telegram was mysteriously removed from Apple’s App Store last week for a number of hours. At the time, little was known about the reason why, except that it had to do with “inappropriate content.” According to a 9to5Mac report, Apple removed Telegram after the app was found serving up child pornography to users.

A verified email from Phil Schiller details that Apple was alerted to child pornography in the Telegram app, immediately verified the existence of the content, and removed the app from its online stores. Apple then notified Telegram and the authorities, including the National Center for Missing and Exploited Children. Telegram apps were only allowed to be restored to the App Store after Telegram removed the inappropriate content and reportedly banned the users who posted it.

“The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content,” the email reads. “Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.”

apple – Ars Technica

Counter-Strike co-creator arrested, suspended from Valve for sexual exploitation of a child


Seattle’s KIRO-7 news outlet reports that 36-year-old Jess Cliffe, who co-created the immensely popular multiplayer FPS franchise Counter-Strike and is an employee at gaming giant Valve, has been arrested for sexual exploitation of a child. Details are scant, but the channel noted that a booking over the aforementioned charges typically points to creating images and/or video of child sexual abuse. Charges haven’t yet been filed against Cliffe, and he is expected to attend a bail hearing on Friday. Valve said in a statement to Ars Technica that it’s still learning about what actually happened and that his employment has been suspended…

The Next Web

Child welfare advocates protest Messenger Kids — can Facebook meet them halfway?


Facebook’s feet are being held to the fire again; this time, it’s being criticized by those hoping to protect children. Child advocacy group Campaign for a Commercial-Free Childhood (CCFC) today wrote an open letter to Facebook’s Mark Zuckerberg outlining its objections to the company’s child-centered app, Messenger Kids. The letter has been signed by more than 100 organizations and individuals. The CCFC wraps up the letter by asking Facebook to delete Messenger Kids, but let’s be realistic: Facebook’s not going to do that. The company is desperate to pull in the previously untapped audience of under-13s…

The Next Web

YouTube’s poor AI training led to rise of child exploitation videos

YouTube uses algorithms and human moderators, but it still couldn't prevent the rise in disturbing, child-exploitative videos on the platform. Why? There are likely various reasons; one of them, according to a BuzzFeed report, is the confus…
Engadget RSS Feed

YouTube pulls autocomplete results that showed child abuse terms

YouTube has been working hard lately to fix issues around child exploitation and abuse. The Google-owned video service revamped its policies and their enforcement around videos featuring minors or family-friendly characters in disturbing situations…
Engadget RSS Feed