In addition to our standalone articles covering the latest news and rumors at MacRumors, this Quick Takes column provides a bite-sized recap of other headlines about Apple and its competitors on weekdays.
A wearable being developed at MIT’s Media Lab knows what its wearer is going to say before any sound is made.
The AlterEgo device uses electrodes to pick up neuromuscular signals in the jaw and face that are triggered by internal verbalisations – all before a single word has been spoken, claim MIT’s researchers.
Every one of us has an internal monologue of sorts, a place where our most intimate thoughts come and go as they please. Now, thanks to sophisticated sensors and the power of machine learning, the act of saying words in your head might not be so private after all.
MIT believes that the simple act of concentrating on a particular vocalisation is enough to engage the system and receive a response, and it has developed an experimental prototype that appears to prove it.
To ensure that the conversation remains internal, the device includes a pair of bone-conduction headphones. Instead of sending sound directly into the ear, these transmit vibrations through the bones of the face to the inner ear, conveying information back to the user without interrupting the normal auditory experience.
Arnav Kapur, the graduate student who is leading development of the new system at MIT’s Media Lab, wants to augment human cognition with more subtlety than today’s devices allow for. “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways, and that feels like an internal extension of our own cognition?” he said.
Kapur’s thesis advisor, Professor Pattie Maes, points out that our current relationship with technology – particularly smartphones – is disruptive in the negative sense. These devices demand our attention and often distract us from real-world conversations, our own thoughts, and other things that should demand greater attention, such as road safety.
“We basically can’t live without our cellphones, our digital devices,” she said. “But at the moment, the use of those devices is very disruptive. If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself.”
The challenge is to find a way to alter that relationship without sacrificing the many benefits of portable technology.
“So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present,” she said.
Instead of being a precursor to some kind of Orwellian dystopia, the MIT team believes that the technology, once perfected, could improve the relationship between people and the devices they use, as well as serving a variety of practical functions.
So far, the device has been able to surreptitiously give users information such as the time and solve mathematical problems. It has also given wearers the power to win chess games, silently receiving opponents’ moves and relaying computer-recommended responses, MIT claims.
The team is still collecting data and training the system. “We’re in the middle of collecting data, and the results look nice,” Kapur said. “I think we’ll achieve full conversation some day.”
The platform could one day provide a way for people to communicate silently in environments where noise is a concern, from runway operators to special forces soldiers. And it could perhaps even open up a world of verbal communication for people who have been disabled by illness or accident.
The rise of voice search in the US – where 20 percent of all searches are now voice-triggered, according to Google – together with the rapid spread of digital assistants, such as Siri, Alexa, Cortana, Google Assistant, and IBM’s new Watson Assistant, has shifted computing away from GUIs, screens, and keyboards. And, of course, smartphones and tablets have moved computers off the desktop and out of the office, too.
However, while voice is the most intuitive channel of human communication, it isn’t suitable for navigating through, and selecting from, large amounts of visual data, for example, which is why technophiles are always drawn back to their screens.
This new interface will excite many, and may have a range of extraordinary and promising applications. But doubtless it will alarm many others as the rise of AI forces us to grapple with concepts such as privacy, liability, and responsibility.
And let’s hope, too, that this technology doesn’t always translate what’s on human beings’ minds into real-world action or spoken words, as the world could become a bizarre place indeed.
In the meantime, transhumanists will see this as yet another example of the gradual integration of technology with biology – and with good reason. But whether these innovations will encourage us to become more human, and less focused on our devices, is a different matter; arguably, such devices may train human beings to think and behave in more machine-like ways to avoid disorderly thinking.
Meanwhile, thoughts that can be hacked? Don’t bet against it.
We don’t usually cover infographics and comparisons on Android Police – most are biased, pointless, or without much credibility – but this latest data set from appfigures made us stop and do a double-take because it presented interesting stats from a rather reputable app analytics company.
appfigures’ report looked at 10 different questions regarding the Play Store and App Store and tried to answer them with stats pulled from its Explorer app research tool.
Ten years ago, Steve Jobs announced the App Store. While its first titles were mostly games and novelties, major businesses soon began to recognize the power of mobile apps, shifting major investment from desktop PCs and web apps into iOS. This year, Apple is inciting new enterprise investment in iMessage Apps with Apple Business Chat, billed as an interactive, personal way to connect with customers while respecting their privacy. [AppleInsider]
Amid growing pressure to remove bad actors from Facebook, CEO Mark Zuckerberg said Wednesday that the company would likely release more information about problematic content posted to the service during elections. But to ensure the accuracy of the data, Zuckerberg said, the reports will likely come after the elections are over. The move could help government officials, academic researchers, and concerned citizens understand whether Facebook’s increased attention to abuse is working — but the timing could make it harder to grasp what’s happening when it arguably matters most.
Between the glowing blue and yellow swirls of distant galaxies, this tiny pinprick of light doesn’t look like much: a white smudge on the infinite black of the universe.
But this tiny speck has enormous significance for astronomers. It’s the most distant star ever seen, affording astronomers a glimpse back in time.
The star, MACS J1149+2223 Lensed Star 1 (more simply known as “Icarus”) was about 9 billion light years away when it emitted the light now reaching Earth. Most other objects spotted at this distance are either galaxies or exploding stars (AKA supernovas), which produce much more light than this distant glimmer.
Thanks to the constant expansion of the universe, Icarus would now be much further away from our planet; by now, it’s probably gone supernova itself, and formed either a black hole or neutron star. (For why we can still view it, though, see #3.)
Here are four things you should know about this distant galactic neighbor, and why we’re just seeing it for the first time.
1. Spotting Icarus was a stroke of good luck
Icarus is so far away that we technically shouldn’t be able to see it: it’s about 100 times further away than the most distant star telescopes have been able to view before now. Fortunately, astronomers got a little bit of help from the universe in spotting it (and the Hubble telescope, props to that).
Icarus was visible because of an astronomical phenomenon called gravitational lensing. In short, the gravity of large, stacked-up celestial objects (in this case, a cluster of galaxies) bends light, creating a magnifying-glass effect for anything behind them. Overall, researchers told The Guardian, Icarus was magnified more than 2,000 times.
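To get a feel for what a 2,000-fold magnification means in astronomers’ terms, here is a minimal sketch that converts a flux magnification factor into the equivalent gain in astronomical magnitudes, using the standard relation Δm = 2.5 · log₁₀(μ). The 2,000× figure comes from the report above; everything else is just arithmetic.

```python
import math

def magnification_to_magnitudes(mu: float) -> float:
    """Brightness gain in magnitudes for a flux magnification factor mu."""
    return 2.5 * math.log10(mu)

# The ~2,000x magnification reported for Icarus:
delta_m = magnification_to_magnitudes(2000)
print(f"A 2,000x magnification brightens a source by ~{delta_m:.2f} magnitudes")
```

That works out to roughly 8.25 magnitudes — enough to lift an otherwise invisible single star above Hubble’s detection threshold.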
Icarus also got a special boost from an extra-magnifying star within the galaxy cluster, making it appear four times brighter over the period the astronomers studied it. Thank you, physics.
2. The star is a blue supergiant
Icarus would be an oddity in the universe — if it were still around. Analysis of the star’s light showed it was a blue supergiant, one of the hottest and highest-mass stars we know of; the blue supergiant Rigel A, the bright left “foot” of the constellation Orion, is 23 times more massive than the sun, and estimated to be several hundred thousand times brighter.
Stars like Icarus and Rigel are rare in the universe today, but in the early universe, they were common; according to io9, most of the early stars were blue supergiants at some point in their lives.
That makes sense, since Icarus’ distant light is actually somewhat like a time machine.
3. Icarus gives a view back in time
The universe is way, way bigger than you can probably comprehend. And because of this astronomical (sorry) size, it can take a really long time for light to reach Earth from the cosmic wilderness. Even traveling at its immense speeds, by the time light from this distant star reached Earth, 9 billion years had passed.
When Icarus released the photons currently hitting the Hubble’s cameras, Earth hadn’t even formed yet — it would be another 4.4 billion years before our solar system even began to coalesce from the dust of the universe. Such distant views of the universe are helping astronomers learn about what the universe was like before our time, even giving us glimpses back to the moments after the Big Bang.
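The timeline above can be sketched with back-of-the-envelope arithmetic, assuming the round figures in the text: a light travel time of roughly 9 billion years and a solar-system age of roughly 4.6 billion years (both approximations, not precise cosmological values).

```python
# Approximate figures from the article (in billions of years):
LIGHT_TRAVEL_TIME_GYR = 9.0  # time the light from Icarus spent in transit
SOLAR_SYSTEM_AGE_GYR = 4.6   # rough age of our solar system

# Gap between the light leaving Icarus and the solar system
# beginning to coalesce:
gap_gyr = LIGHT_TRAVEL_TIME_GYR - SOLAR_SYSTEM_AGE_GYR
print(f"Light left Icarus ~{gap_gyr:.1f} billion years "
      "before the solar system began to form")
```

That ~4.4-billion-year gap is where the “Earth hadn’t even formed yet” figure comes from.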
4. The view let scientists test dark matter theory
The Guardian reports that the team also used their view of Icarus to test a theory about dark matter, the mysterious substance that makes up 27 percent of the universe (its counterpart, dark energy, makes up another 68 percent). One theory proposed that dark matter was made of black holes, but what the researchers saw of Icarus didn’t support that theory — looking back at a decade of Hubble images, they didn’t see Icarus’ brightness vary over time. If the black-hole-dark-matter theory were correct, the star would have appeared brighter.
In the coming years, scientists hope to peer even further into our universe’s history with more powerful telescopes, like the James Webb Space Telescope and the Wide Field Infrared Survey Telescope (WFIRST). Recent budget cuts from the White House have threatened the future of WFIRST. If the government is unsure just how much these space telescopes could accomplish, this discovery from their predecessor might serve as an apt reminder.
This is normal. Heart pounding, hands shaking, head packed with static. The absolute inability to process what anyone is saying, let alone respond to it. Sitting alone at home — lights off because you’ve been inside all day and the sun set hours ago… [Engadget]
Spotify’s first day of trading as a public company ended up being surprisingly normal — and that’s a win for its “direct listing” strategy. About 30 million shares of Spotify’s 178 million outstanding shares traded hands yesterday, and after a drop of about 11 percent from the opening price of $165.90, the stock was trading at around $149 by afternoon; at the end of the day, Spotify was considered by Wall Street to be a $27 billion company. [Theodore Schleifer and Rani Molla / Recode]
Apple has hired Google’s chief of search and artificial intelligence. John Giannandrea, who helped lead the push to integrate AI throughout Google’s products, will run Apple’s machine learning and AI strategy, and will be one of 16 executives who report directly to CEO Tim Cook. The hire is a major coup for Apple, which many Silicon Valley executives and analysts view as lagging its peers in artificial intelligence. [Jack Nicas and Cade Metz / The New York Times]
Make time for this wide-ranging conversation with Saudi Crown Prince Mohammed bin Salman, who is on a seemingly endless pilgrimage to the nodes of American power — he visited Silicon Valley and Hollywood this week. The prince’s current U.S. visit is mainly a hunting trip for investment, and an opportunity for him to sell his so-called Vision 2030, an elaborate, still mainly unexecuted plan to modernize the Kingdom of Saudi Arabia and end its dependence on oil. [Jeffrey Goldberg / The Atlantic]
A lot has been made about Apple’s possible shift to the A-series processor in the Mac starting in 2020 — but this isn’t the first time that Apple has convinced a generation to change hardware architectures. [AppleInsider]