IoT Podcast: Apple’s HomePod and chip news galore

An Intel NUC board beloved by the Industrial IoT.

Get out the guacamole, because you’re going to hear a lot about chips on this week’s Internet of Things Podcast! ARM announced a new machine learning architecture called Project Trillium and said it would license an object detection design as well as one that could handle some basic training at the edge. Amazon, too, is building a chip for its edge devices, and machine learning will certainly play a part.

Also on this week’s podcast, Stacey and Kevin cover Intel’s smart glasses, Kevin’s opinions on the Apple HomePod and Google’s new IoT hire. They also answer a listener’s question about using different profiles with the Amazon Echo.

The guest this week is Alexandros Marinos, who is the CEO of Resin.io. He discusses the popular hardware platforms for prototyping, the industrial IoT, and an up-and-coming platform that is breaking out because of interest in machine learning. He also talks about the similarities and differences between servers and connected devices as they relate to building software to manage them. You’ll learn that servers are like cattle, not like pets.

Listen here:

Stacey on IoT | Internet of Things news and analysis

AI Smartphones Will Soon Be Standard, Thanks to Machine Learning Chip

AI Built In

Almost every major player in the smartphone industry now says that their devices use the power of artificial intelligence (AI), or more specifically, machine learning algorithms. Few devices, however, run their own AI software. That might soon change: thanks to a processor dedicated to machine learning for mobile phones and other smart-home devices, AI smartphones could one day be standard.

British chip design firm ARM, the company behind virtually every chip in today’s smartphones, now wants to put the power of AI into every mobile device. Currently, devices that run AI algorithms depend on servers in the cloud. It’s a rather limited setup, since connectivity determines whether and how quickly information can be sent back and forth.

Project Trillium would make this process much more efficient. ARM’s built-in AI chip would allow devices to continue running machine learning algorithms even when offline. This reduces data traffic and speeds up processing, while also saving power.
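The cloud-versus-local trade-off described above can be illustrated with a toy sketch. Everything here is hypothetical: the function names are made up and the “models” are stand-in decision rules, not ARM’s designs. The point is only that an on-device model keeps answering when the network does not:

```python
def cloud_classify(sample, online):
    """Simulate a cloud round trip; fails when there is no connectivity."""
    if not online:
        raise ConnectionError("no network")
    return "cat" if sum(sample) > 1.0 else "dog"

def local_classify(sample):
    """The same decision rule run on-device: no traffic, no round-trip latency."""
    return "cat" if sum(sample) > 1.0 else "dog"

def classify(sample, online):
    """Prefer the cloud model, but fall back to the on-device model when offline."""
    try:
        return cloud_classify(sample, online)
    except ConnectionError:
        return local_classify(sample)

# The device still produces an answer with no connection at all.
print(classify([0.9, 0.4], online=False))  # prints "cat"
```

In this sketch the fallback also happens to be the privacy win the article mentions: when `local_classify` handles the request, the sample never leaves the device.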

“We analyze compute workloads, work out which bits are taking the time and the power, and look to see if we can improve on our existing processors,” Jem Davies, ARM’s machine learning group head, told the MIT Technology Review. Running machine learning algorithms locally would also mean fewer chances of data leaking in transit.

A Staple for Mobile Phones

With the advantages machine learning brings to mobile devices, it’s hard not to see this as the future of mobile computing. ARM, however, isn’t the first to try to make this happen. Apple has already designed and built a “neural engine” as part of the iPhone X’s main chipset to handle the phone’s artificial neural networks for image and speech processing.

Google’s own chipset, in its Pixel 2 smartphone, does something similar. Huawei’s Mate 10 packs a neural processing unit developed by the Chinese smartphone maker. Amazon might follow soon with its own AI chips for Alexa.

A diagram showing how Project Trillium will develop chips for AI smartphones, beginning with ground-up design, progressing to uplift from processors, and enabled by open-source software, ending in a processor that targets the mobile market.
Image credit: ARM

The MIT Tech Review notes, however, that ARM’s track record with energy-efficient mobile processors could translate into more widespread adoption of its AI chip. ARM doesn’t actually make the chips it designs, so the company has started sharing its plans for this AI chip with its hardware partners, like smartphone chipmaker Qualcomm. ARM expects its machine learning processor to appear in devices by early 2019.


Futurism

MIT has a new chip to make AI faster and more efficient on smartphones

Just one day after MIT revealed that some of its researchers had created a super low-power chip to handle encryption, the institute is back with a neural network chip that reduces power consumption by 95 percent. This feature makes it ideal for battery-powered devices.
Engadget RSS Feed

MIT’s new chip could bring neural nets to battery-powered gadgets


MIT researchers have developed a chip designed to speed up the hard work of running neural networks, while also dramatically reducing the power consumed when doing so — by up to 95 percent, in fact. The basic concept involves simplifying the chip design so that shuttling data between different processors on the same chip is taken out of the equation.
Mobile – TechCrunch

MIT’s low power encryption chip could make IoT devices more secure

The Internet of Things hasn't ever been super secure. Hacked smart devices have been blamed for web blackouts, broken internet, spam and phishing attempts and, of course, the coming smart-thing apocalypse. One of the reasons that we haven't seen the…
Engadget RSS Feed