The United Kingdom’s Advertising Standards Authority has banned an HTC ad that had been running on social media since mid-last year. The ad in question features Olympic diver Tom Daley diving feet-first into a pool (after a flip or two) while holding the HTC U11 over his head, taking selfies on his descent, and then exiting the pool with the phone in hand. Despite a disclaimer at the bottom, the ASA still deemed the ad misleading. It highlighted the fact that the product’s instructions explicitly warn users against intentionally…
A couple of weeks ago, Google started rolling out a new Play Store redesign on the web. On the face of it, it looked a little cleaner, but try to use it for one minute and you’d hate everything about it. Now it appears that Google has pulled it back. None of us can get the new design to show up again, regardless of how many listings and browsers and devices we try on; we’re all back to the previous Play Store look on the web.
Google seems to have pulled back the web Play Store redesign (thank goodness!) was written by the awesome team at Android Police.
Facebook is obviously in some very hot water as details regarding Cambridge Analytica's use of its users' data continue to unfold. And along with heated consumer backlash and questions from lawmakers, Facebook may now start to lose advertising money….
Engadget RSS Feed
Two days ago, director Michael Jacobs described his first VR short, GFE (an acronym of "girlfriend experience"), to me on camera. The film is a "documentary fantasy," Jacobs said, with a focus on "demystifying escort work and bringing a sense of empo…
One of the first third-party keyboards for iOS, Swype, has reached its end of life, and will see no more updates or development effort by its parent company.
AppleInsider – Frontpage News
There has been a recent increase in accidents caused by distracted drivers behind the wheel, and the number one culprit of distraction is the cell phone. Having these super-fast mobile computers in our pockets really entices us to take them out while waiting at a light or when stuck in traffic. In France, a new ruling from the Court of Cassation has altered the texting-and-driving laws currently in effect. Specifically, the law states: The use of a telephone held in hand by the driver of a vehicle in circulation is prohibited. This court ruling has changed the legal…
Today we’ve got a new replacement program for iPhone 7, the explanation behind Telegram being abruptly pulled from the App Store last week, and a few updates on HomePod and Apple Music.
Popular secure messaging app Telegram was removed from the App Store last week — and now we officially know why.
The primary Telegram app and Telegram X (which is in testing) were both removed from the iOS App Store on Jan. 31. The next day, Telegram CEO Pavel Durov tweeted that the app was removed due to the presence of “inappropriate content.”
No other reason was given, but Durov added that once “protections” were in place, the Telegram app would reappear on the iOS app storefront. Indeed, the Telegram app returned to the App Store on Feb. 2.
But today, we’re getting a clearer picture of what “inappropriate content” caused the secure messaging platform to be taken down.
In an email in response to a 9to5Mac reader, Apple marketing chief Phil Schiller reported that Apple’s App Store team was alerted to “illegal content,” specifically child pornography, being shared through Telegram.
“After verifying the existence of the illegal content, the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children),” Schiller wrote in the now-verified email.
Presumably, the Telegram apps returned to the App Store with protections to stop the illegal content from being spread. Apple’s App Store guidelines require platforms to contain filters for objectionable material and the capability to report it, as well as the ability to block users, TechCrunch reported.
“We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity,” Schiller added. “Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.”
Distribution of child pornography is among the most grievous offenses on the internet, and the vast majority of social networks, tech platforms and websites include mechanisms to immediately detect and remove it. Telegram, apparently, wasn’t as prepared. So Apple removed the app while its developer figured out how to eradicate the issue.
The secure messaging app offers a suite of advanced security features that allow users to hold private and secret conversations that are end-to-end encrypted. Notably, it was one of the first messaging platforms to feature end-to-end encryption when it launched in 2013.
But while the ability to hold secret conversations may be Telegram’s main selling point, it’s also its primary flaw, as the app has faced issues in the past with terrorism and terrorist-related content. The platform and its developers have been widely criticized by governments for being the “app of choice” for terrorist organizations like ISIS, Vox reported in June 2017.
Telegram was nearly banned by the Indonesian government for “terrorist-related content,” and the developers were forced to create a moderator team to tackle the content in the country, The Verge reported.
When Apple temporarily pulled Telegram from the App Store over "inappropriate content," it left many wondering just what that content was. We now know: 9to5Mac has learned that the company removed the app after discovering that people had been distr…
Telegram’s developer provided few details as to why its secure messaging applications briefly disappeared and reappeared in Apple’s App Store several days ago, but Apple today confirmed that it pulled the apps for a serious reason: Telegram was serving child pornography to users, and it wouldn’t be allowed back in the App Store until the issue was fixed.
According to a report from 9to5Mac, App Store chief Phil Schiller said that Apple had been alerted that Telegram’s apps were sharing child pornography, which Apple verified, removing the apps and notifying authorities. Rather than remaining passive about the problem, Apple then pushed the developer to remove the content, ban the users responsible for posting it, and install “more controls to keep this illegal activity from happening again.”
Apple’s removal of Telegram from the App Store coincided with seemingly minor updates to the developer’s Android apps, suggesting that nothing serious was amiss until Apple said otherwise. Telegram has long promised users ultra-secure communications that cannot be read even by foreign governments, but it has been targeted by Iranian state-sponsored hackers, and more recently criticized by Russian authorities for facilitating terrorism. Telegram has previously brushed off complaints about bad uses of its service, suggesting that dangerous users will simply change apps, and truly blocking them would require blocking the internet.
Nonetheless, Apple clearly believed that Telegram could do more to police bad users, and apparently leveraged the possible loss of its iOS user base to force a rapid change. Schiller explained in an email that Apple “will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity… we have zero tolerance for any activity that puts children at risk.”
Apple’s action on behalf of at-risk children comes several weeks after activist investors asked the company to do more to protect children from “iPhone addiction,” and CEO Tim Cook suggested that social media might share blame for device “overuse.” Schiller’s email is reproduced below.
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.
We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.
I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.