A new potential iOS 11.3 jailbreak 0day bug has just been discovered by an iOS security engineer. Here’s what that means for any possible future jailbreak.
[ Continue reading this over at RedmondPie.com ]
When done wrong, higher performance hurts power efficiency. When done right, power efficiency enables higher performance.
Lithium-ion batteries are getting a lot of attention these days. Everything from the Galaxy Note 7 recall to, yes, Apple’s discounted battery replacement program for older iPhones, means that it’s top of mind for many customers. So, it’s not surprising the Exynos version of Samsung’s Galaxy S9 is making headlines for its battery performance this week, and not in a good way.
From Yonhap News Agency:
The battery performance of Samsung Electronics Co.’s Galaxy S9 smartphone trailed behind rival products, industry watchers said Monday, causing consumers to express discontent.
From the same report:
Phone Arena, another industry tracker, also said in its report that the battery of the Galaxy S9 lasted 7 hours and 23 minutes in its test, which is an hour below the Galaxy S8’s 8 hours and 22 minutes. Apple Inc.’s iPhone X and LG’s V30 held comparable figures of 8 hours and 41 minutes and 9 hours and 34 minutes, respectively.
AnandTech tests were even more brutal:
The Exynos 9810 Galaxy S9 absolutely fell flat on its face in this test and posted the worst results among our tracking of the latest generation devices, lasting 3 hours less than the Exynos 8895 Galaxy S8. This was such a terrible run that I redid the test and still resulted in the same runtime.
Yonhap theorizes that Samsung Electronics has become more conservative about battery capacity, given the catastrophic failures consumers experienced with the Galaxy Note 7. But capacity alone is seldom, if ever, an issue, as Yonhap itself explains:
“Although the battery’s capacity is also important, the phone’s optimization algorithm is very crucial,” an industry insider said. “The Galaxy S9 came with various new features, which possibly led to more stand-by power consumption. (Samsung) may have failed to develop power-saving algorithms properly.”
It’s possible there’s something going wrong at the system level that’s just burning power, even when it shouldn’t. That’s the best case scenario for everyone.
Otherwise, it’s a bigger problem, but one that’s probably simpler than conservatism, at least in part. Samsung recently chose to prioritize single-core performance, something Apple has cared about and architected for for years. Samsung has clearly seen the advantage and is now making it a priority as well. But drastically increasing single-core performance has a cost. Absent a die shrink, or previous generations being so inefficient that there was significant room for improvement, that cost is power consumption.
AnandTech put it this way:
This is such a terrible battery performance of the Exynos 9810 variant that it again puts even more clout into the new SoC. My theory as to why this happens is that not only do the higher frequency state require more energy per work done than competing SoCs – because this is a big CPU complex there’s also lots of leakage at play. The DVFS system being so slow might actually be bad for energy here as we might be seeing the opposite of race-to-sleep, walk-to-waste. The fact that Apple’s SoCs don’t have any issues with battery life in this test showcases that it’s not an inherent problem of having a high-power micro-architecture, but rather something specific to the Exynos 9810.
Unfortunately it feels like S.LSI keeps being one generation behind when it comes to efficiency – the A72 beating the M1, the A73 beating the M2 and now the A75 beating the M3. If you were to shift the microarchitectures one year ahead in Samsung’s favour then suddenly we would have had a much better competitive situation. What needs to happen with the M4 is a much larger efficiency boost to remain competitive with ARM’s upcoming designs and actually warrant the use of an internal CPU design team. Currently a 17-22% performance lead does not seem worth a 35-58% efficiency disadvantage along with the 2x higher silicon area cost.
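The race-to-sleep versus walk-to-waste trade-off AnandTech describes can be sketched with a toy energy model. All the numbers below are illustrative assumptions, not measured Exynos or Apple figures: dynamic CPU power scales roughly with frequency times voltage squared, while leakage power stays high until the core drops into a deep-idle state.

```python
# Toy model of "race-to-sleep" vs. "walk-to-waste".
# All figures are hypothetical; dynamic power uses a crude f * V^2 proxy.

def window_energy(f_ghz, volts, leak_active_w, leak_idle_w,
                  work_cycles, window_s):
    """Energy over one fixed window: finish the work, then idle."""
    active_s = work_cycles / (f_ghz * 1e9)   # time to finish the work
    dynamic_w = f_ghz * volts ** 2           # crude f * V^2 dynamic-power proxy
    idle_s = window_s - active_s
    return (dynamic_w + leak_active_w) * active_s + leak_idle_w * idle_s

WORK = 0.8e9   # cycles of work arriving each 1-second window
WINDOW = 1.0   # seconds

# Race-to-sleep: sprint at 2.7 GHz / 1.0 V, then power-gate, so leakage
# collapses to 0.05 W for the rest of the window.
race = window_energy(2.7, 1.0, leak_active_w=1.0, leak_idle_w=0.05,
                     work_cycles=WORK, window_s=WINDOW)

# Walk-to-waste: amble at 1.0 GHz / 0.8 V; slow DVFS never reaches the
# deep-idle state, so the big core keeps leaking ~1 W all window long.
walk = window_energy(1.0, 0.8, leak_active_w=1.0, leak_idle_w=1.0,
                     work_cycles=WORK, window_s=WINDOW)

print(f"race-to-sleep: {race:.2f} J  walk-to-waste: {walk:.2f} J")
```

With these made-up numbers the sprint-then-sleep strategy burns less energy over the window, but only because it reaches a low-leakage idle state quickly; if DVFS is slow and leakage is high, as the quote suggests for the Exynos 9810, the advantage evaporates.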
The same feels true of Samsung and Apple more broadly: Samsung perpetually seems a generation behind when it comes to efficiency.
Android Central‘s take:
That sort of battery performance is abysmal for a flagship phone like the Galaxy S9.
The truth is, hardware is tough. And silicon is especially tough.
Given infinite time, any good silicon team could design a system-on-a-chip that would achieve maximum performance at maximum efficiency up to the limits of known physics in our universe. Release schedules are the opposite of infinite time, though. You get a few years to plan, but you have to ship every year.
What Apple’s done to meet that demand is to establish a solid foundation and to build and iterate on it each and every year.
Apple A7 was the first 64-bit ARM chip in a phone. Apple A10 Fusion introduced paired efficiency and performance cores, so that reaching higher wouldn’t leave a gap beneath. Apple A11 Bionic increased the performance of the efficiency cores, while also introducing a neural engine and everything required to support Face ID. And all with at least the same, sometimes better battery life.
It’s not just a multi-year plan, it’s a multi-year investment.
To complicate matters, unlike Apple, Samsung has chosen to use two different chipsets for its phones: Exynos, which is made by Samsung’s silicon company, and Snapdragon, which is made by Qualcomm. It’s the Exynos version specifically that’s experiencing these problems.
With Apple and its consistent processor architecture per year per device, everything is a known quantity. This season’s iPhone X and iPhone 8, for example, all run on the same Apple A11 Bionic system-on-a-chip (SoC), on every carrier, in every region.
That means every component, from the rest of the hardware to all of the software, is a known quantity, and can work as part of an integrated whole to eke out as much performance while maintaining as much efficiency as possible.
Having two silicon targets just means, as opposed to infinite time, you have half the time to optimize for each.
Meanwhile, customers have come to expect phones that are faster and more battery efficient than ever. Oh, and lighter too.
(Internet experts love to talk about increasing battery size, but lightness is an incredibly important part of usability — never mind thermal insulation and RF interference, no one wants to buy the heaviest phone on the carrier shelf.)
Meeting and exceeding those expectations is a huge challenge. Pack a battery wrong and it burns. Boost performance wrong and it burns out.
But architect it right and the performance doesn’t come at the expense of the efficiency. The efficiency enables the performance.
Not everyone thinks about these things when buying a phone, or when arguing about specs on Twitter.
But it’s clear Apple is thinking about it deeply. And it shows in iPhone X, where the custom silicon drives everything from the machine learning-based biometrics to the display technology to the industry leading performance and, yes, the power efficiency that allows for extended battery life as well.
And it’s absolutely something consumers should think about, not just when buying a phone but when investing in a platform.
Tech can help equalize opportunities in education, the Mayor says.
Chicago Mayor Rahm Emanuel explained how he sees Apple helping Chicago public school students learn how to code after the company’s education-themed keynote on Tuesday.
Emanuel spoke with Recode’s Kara Swisher after Apple’s event at Lane Tech College Prep High School, where Apple announced its new partnership with Chicago Public Schools and Northwestern University to train local computer science teachers in coding.
“Apple is an important part of making computer coding universal and making sure kids have that,” said Emanuel. “There’s 6,000 school districts across the United States. Every one of them would be excited to have Apple.”
Apple is creating a Center for Excellence at Lane Tech where Northwestern University trainers will provide free technical education to local high school teachers through Apple’s Everyone Can Code program as well as training on Apple’s programming language, Swift. The company says the program is an effort to address the shortage of high school computer science teachers.
Chicago Public Schools made coding a requirement for high school graduation back in 2015 — the first urban school district to do so — and has educational partnerships with other tech companies such as Cisco and IBM, the Mayor said.
Still, Emanuel emphasized that technology should never supplant the fundamentals of education.
“Technology doesn’t replace literature, it should complement it,” said Emanuel. “Sometimes there is an overemphasis on technology as if the other stuff is not necessary,” he said.
In his interview with Recode, Emanuel also discussed his passionate support for Dreamers and the importance of privacy online. You can watch the full video below:
To learn more about Apple’s plans around education and job training, watch Tim Cook’s interview on “Revolution: Apple Changing the World,” a TV collaboration between Recode and MSNBC that is scheduled to air on Friday, April 6 at 8 pm ET.
Residents of Shenzhen don’t dare jaywalk.
Since April 2017, this city in China’s Guangdong province has deployed a rather intense technique to deter jaywalking. Anyone who crosses against the light will find their face, name, and part of their government ID number displayed on a large LED screen above the intersection, thanks to facial recognition devices all over the city.
If that feels invasive, you don’t even know the half of it. Now, Motherboard reports that a Chinese artificial intelligence company is partnering with mobile carriers to link the system to phone numbers, so that offenders receive a text message with a fine as soon as they are caught.
The system is just one cog in the vast surveillance machine that the Chinese government has been building over the last several years. Its aim is in part public safety and security, but the information on citizens’ whereabouts and activities will also feed into China’s national social credit system.
The social credit system, which will roll out in 2020, is intended to rate individuals according to a national scoring system for how trustworthy a citizen each person is. Citizens with a low score might be refused certain jobs, pay more for certain services, and according to China’s National Development and Reform Commission, even be banned from traveling. According to Fortune, that travel restriction could kick in from a wide range of offenses, from spreading fake information about terrorism to smoking on the train.
It’s like an Orwellian fever dream, we know.
The facial recognition scheme may seem like a humdrum, run-of-the-mill aspect of life for citizens who have no expectation of privacy. But in fact, it does seem the Chinese people care about privacy. After the CEO of Baidu, China’s largest search engine, publicly said, “If they [Chinese] are able to exchange privacy for safety, convenience or efficiency, in many cases, they are willing to do that, then we can make more use of that data,” a heated online uproar ensued.
Still, we don’t blame you if the concept of social credit feels unsettling. It’s not exactly likely that the Chinese government will be transparent about how they decide what leads to deductions or restrictions. Human Rights Watch (HRW) states that “the near future for human rights appears grim” in China, given the country’s history of muzzling free speech and punishing those who speak out against the government; though HRW makes no mention of the social credit system, it’s possible that the system could lead to a continuation, or exacerbation, of these practices in the future.
Yes, the technology deployed around China could be useful. But the nation’s unsettling track record serves as a reminder of how its power could easily be abused.
The post If You Jaywalk in China, Facial Recognition Means You’ll Walk Away With A Fine appeared first on Futurism.
Roughly 65 million light-years away from Earth is a galaxy called NGC 1052-DF2 (DF2 for short). But DF2 may as well be called F-U, because that’s what it’s saying to scientists who thought they understood galaxies, dark matter, and really anything about our universe.
What makes DF2 so special, you may ask? It appears to contain virtually no dark matter.
We’ve never seen dark matter directly. We only believe dark matter exists because we can see how it affects “regular,” or baryonic, matter. Based on these indirect observations, researchers have estimated that dark matter makes up about 27 percent of our universe.
Since dark matter was (sort of) discovered, researchers assumed dark matter was essential to galaxy formation. Dark matter would clump together. Then, the gravity from those clumps would attract baryonic matter, forming the stars, planets, and other objects we can actually see within a galaxy. Easy, right?
Based on this understanding, the team studying DF2 thought they had a pretty good idea how much dark matter it contained. But when they calculated how much dark matter DF2 actually had, they discovered it contained only 1/400th the amount they expected.
“It challenges the standard ideas of how we think galaxies work,” Pieter van Dokkum, a Yale University professor and lead author of a paper on DF2, now published in Nature, said in a press release. “This result also suggests that there may be more than one way to form a galaxy.”
DF2 is unique in other ways, too. It doesn’t fit the characteristics of a spiral galaxy, which typically have dense, central regions, spiral arms, and a disk. But it also isn’t like known elliptical galaxies, which have a black hole at their center.
Instead, DF2 is a rare ultra-diffuse galaxy. “It’s so sparse that you see all of the galaxies behind it,” van Dokkum said. “It is literally a see-through galaxy.”
This might seem counterintuitive, but DF2 actually supports the existence of dark matter, which some theories argue doesn’t exist.
“For those kinds of theories, it wouldn’t be possible to ever have a galaxy that looks as though it doesn’t have dark matter,” Jocelyn Monroe, a particle physicist and dark matter expert at Royal Holloway, University of London, who was not involved in the study, told The Verge. “So [this galaxy is] really interesting for the potential it has to exclude some of these ideas.”
The researchers hope to pin down the age of DF2. “At the moment, we only know it’s older than 10 billion years, but we’d like to know if it’s 10 billion years old or 13 billion years old, which is right after the Big Bang,” van Dokkum told ABC.
If DF2 does end up being 13 billion years old, it could rack up another superlative: the oldest galaxy ever discovered.
The post Scientists Found a Galaxy With Almost No Dark Matter. Here’s What That Means. appeared first on Futurism.
Your local rag has far more value to your community than reporting on the best pancake joint or the complaints of that guy who had his yard TP’d. True story, read all about it: Local newspapers are actually valuable tools for science.
A recent STAT article highlighted one surprising use of local newspapers: tracking the outbreak of infectious disease. Epidemiologists use local papers to identify outbreaks in their infant stages — way before they’re big enough to make national papers — and to forecast how they might evolve.
For example, computational epidemiologist Maia Majumder told STAT local newspapers were essential when she and her colleagues at the HealthMap disease projection project tried figuring out the source of a 2016–2017 outbreak of mumps in northwestern Arkansas. While it was difficult to get data from the Arkansas Department of Public Health, the Northwest Arkansas Democrat-Gazette freely provided Majumder with the context she needed: the region had the highest rate of vaccine refusal in the state, and the disease was spreading within a community of Marshall Islands immigrants even though they’d been vaccinated.
Yet local newspapers are also vanishing in many places, thanks to falling readership. A recent data project by the Columbia Journalism Review shows that many parts of the United States, particularly in the Midwest, Southeast, and Alaska, have zero local newspapers to rely on. Epidemiologists worry that this data gap could lead researchers to miss outbreaks, or leave holes in their picture of how diseases spread, making outbreaks harder to control.
But disease outbreaks aren’t the only data point that scientific research can target using local newspapers. Local papers have also been essential in tracking the impacts and unspoken threats from broader changes, like climate change.
In Houston, The Texas Tribune became famous for a seemingly “psychic” article that predicted the city’s unchecked growth and proximity to a warming Gulf of Mexico would soon leave it vulnerable to a hurricane. A little more than a year later, Hurricane Harvey hit, devastating Houston. But, it was pointed out, the Tribune‘s writers weren’t consulting a crystal ball when they wrote their piece; “It was the natural outgrowth of great journalism by reporters who know their subjects and communities well and have covered these issues extensively.”
Local reporters, who know their communities and can follow up on the changes and rumors only told around town, are able to spot trends too small for major papers to pick up on. They also track shifts based on local interest, providing a record of change as a process (rather than one characterized only by disaster). For example, research found that local newspapers have increased their sea level rise coverage at a higher rate than larger papers since 2012, with The Miami Herald’s coverage of the topic passing that of The New York Times.
The context that local news provides is especially important given the “shifting baselines” that come with climate change. This term refers to our tendency to adjust our expectations based on what we see as our current reality. As eloquently described by fisheries scientist Daniel Pauly, who coined the term in 1995: “We transform the world, but we don’t remember it. We adjust our baseline to the new level, and we don’t recall what was there … Every generation will use the images that they got at the beginning of their conscious lives as a standard and will extrapolate forward.”
A recent survey by the Society of Environmental Journalists found that nearly 7 of 10 respondents were “very interested” in covering the local angle of climate change, but nearly 6 of 10 said downsizing in their organization makes it more difficult to do so. Making that possible is up to readers everywhere. Journalism is part of our collective memory, but without the support of local news, we risk having some serious gaps in recall.
The Galaxy S9 and S9+ have rear cameras with adjustable aperture settings, allowing the phones to take photos at either f/1.5 or f/2.4. This may mean something to you, but in the event it doesn’t, let me provide a brief explainer.
You might already be familiar with aperture values, or F-stops, as they relate to a camera’s lens. Like the pupil of your eye, the size of a camera’s aperture determines the amount of light that gets in through the lens to the image sensor.
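The gap between those two F-stops is easy to quantify: the light reaching the sensor scales with the aperture area, which goes as 1/N² for an F-number N. A quick sketch of that math (generic photography arithmetic, nothing Galaxy-specific):

```python
import math

def relative_light(n_wide, n_narrow):
    """How much more light the wider aperture admits; light ~ 1/N^2."""
    return (n_narrow / n_wide) ** 2

# The Galaxy S9's two aperture settings.
ratio = relative_light(1.5, 2.4)
stops = math.log2(ratio)   # photographers count doublings of light in stops

print(f"f/1.5 admits {ratio:.2f}x the light of f/2.4 (~{stops:.1f} stops)")
```

So switching from f/2.4 to f/1.5 admits roughly 2.5 times as much light, a bit under a stop and a half, which is why the wider setting helps in low light while the narrower one keeps bright scenes sharp and properly exposed.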
The Galaxy S9 and S9+ have dual-aperture cameras – here’s what that means was written by the awesome team at Android Police.
Bought an Amazon Echo last Christmas? You were not the only one. According to a statement released by Amazon, “millions” of Echo Dots — the manufacturer’s smart home devices featuring voice assistant Alexa — were sold during the holidays, making it the best-selling item on their website. Indeed, virtual assistants like Siri and Alexa are increasingly being used in everyday life — and this changes the way consumers browse the web. Gartner predicts that by 2020, 30 percent of our browsing sessions will be voice conducted. In addition, audio-centric technologies such as Apple’s AirPods, Google Home and Amazon’s Echo, are…
Disney is one of the biggest media players in the game.
As chief strategy officer at Disney, Kevin Mayer runs strategy, biz dev and M&A for the media giant, and he’s the man who put together three amazing purchases: Pixar, Marvel and Lucasfilm. Now he’s engineering Disney’s colossal Fox deal, as well as the company’s move to stream its stuff directly to consumers. Watch his full onstage interview at Recode’s Code Media below: